Friday, January 6, 2017

Raspberry Pi


Do you mind if I geek out for a bit? Thank you.

During my mission (you know, those two years when I had all the time in the world to think about everything I had done wrong, listen to leaders who told me what I was doing wrong at the time, and have random strangers reject my most personal beliefs and tell me how wrong they were... sorry, I digress), I looked forward to getting back to school and finishing my engineering degree. I loved the idea of understanding the world, building and using tools, and all kinds of engineering stuff. I was getting tired of looking at something and guessing at what made it work because I didn't have that information available.

One of the joys of engineering classes, over, say, technology classes, is that you go into the why and the how, not just the application. Over the course of the five years of my B.S., I learned how to build a computer from sand. We made the wafers, we made the circuits, we built the state machines, CPUs, and memory, and we learned how and why to program them. There was little that we didn't investigate. That isn't to say we completely understood it all. Each of these areas had sub-areas, and people had devoted their lives to particular aspects of everything we studied. That wasn't lost on me, so please don't think I'm arrogant about what education I have. I am acutely aware of my own ignorance.

So, one class I had was machine-level programming. We had a board that was about a foot square. It had a small keypad for hex input (0-9, A-F), an LED display (red LEDs, not an LCD), and bare chips. The processor on board was an 8086 with a whopping 16K of memory. With this, we had to build programs and become familiar with how data was stored and processed at the lowest level of programming. We had to be aware of each bit, where it was, and where it flowed. It was a chore to work on. Plus, Intel chips stored everything backward (little-endian, with the least significant byte first), so we had to take that into account. Multi-byte values were hard to read at a glance. It made me really dislike Intel's chips.

In keeping with my program's "make them suffer and then show them an easier way" paradigm, we were moved to a lab with Apple computers. Apple used Motorola's chips, the 68000 (68k) series. There were emulators that allowed us to create low-level programs on the machine, and we could see where the data was in memory, in the registers, and in all the other places data moved around. It was so much easier to understand than Intel's chipset that I fell in love with the stupid thing. It eventually landed me a job with Motorola, where I work to this day. My interview there consisted of "Can you read assembly on the 68k?" and my response was probably along the lines of &lt;pupils dilated&gt; "I LOVE the 68k! What a terrific instruction set!" My main accomplishment during that period was solving a memory leak in low-level code that had plagued a product for over a decade. The product used a real-time operating system that passed messages back and forth between processes via a queue, and occasionally it forgot a message, so that memory never cleared. That was the "memory leak."
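The failure mode there is a classic one. When messages are heap-allocated by the sender and the receiver is responsible for freeing them, any message the dispatcher drops on the floor is orphaned forever. The actual RTOS and product code aren't something I can show, so this is only a hypothetical C sketch (made-up `msg_t`, `send`, and `receive_all` names) of the pattern:

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical message type; the real RTOS message format is not known. */
typedef struct msg {
    struct msg *next;
    char payload[32];
} msg_t;

static msg_t *queue_head = NULL;

/* Sender: allocate a message and push it onto the queue. */
void send(const char *text) {
    msg_t *m = malloc(sizeof *m);
    strncpy(m->payload, text, sizeof m->payload - 1);
    m->payload[sizeof m->payload - 1] = '\0';
    m->next = queue_head;
    queue_head = m;
}

/* Receiver: pop each message, use it, and free it.  If the dispatcher
   ever drops a message without calling free(), that allocation is
   orphaned; repeated over years of uptime, memory slowly runs out.
   That slow loss is the "memory leak." */
void receive_all(void) {
    while (queue_head) {
        msg_t *m = queue_head;
        queue_head = m->next;
        /* ... process m->payload ... */
        free(m);   /* every dequeued message must be freed */
    }
}
```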

So here we are, 22 years later, and I order the computer above. Yes, it is a full-blown computer: $50 on Amazon. The storage is a 16 GB card, not out of line with the VM servers you can get from Amazon or Azure. The OS is a version of Linux. You simply connect a keyboard, mouse, and monitor, and boom, you have a computer. When the first version came out, I read about it and thought it would change the world. It is that accessible. Now we can buy cell phones with the same specs, but they aren't as accessible because you can't fully interact with the hardware. Still, this thing is amazing. It is essentially that foot-square board I worked on, but it has a quad-core ARM chip with 1 GB of RAM onboard plus whatever SD card I can put in. I can access the internet and program the board to run peripherals. For me, it is an amazing tool.

I hope I never get tired of being amazed by what mankind can do with what we have learned over the millennia. It keeps me from being depressed by the stupidity that mankind is still capable of.

