ESD is 15? No, It's 20 Years Old


Article by Jack Ganssle from his newsletter
Embedded Muse 154, Copyright 2008 TGG, January 28, 2008


Embedded Systems Design is now 20 years old. They asked me to write a
retrospective of those 20 years, which is here. But it was just
yesterday, as I recall, that the magazine turned 15. At that time I
wrote a personal history of the embedded landscape which follows. If
anyone else, especially old-timers, wants to share their perspectives,
please do. If you love the history of this field as I do, and are ever
in Mountain View, CA, be sure to visit the Computer History Museum.


If there's a dweeb gene I got a double dose. Computers fascinated me
from early childhood. Yet in the 60s none of us had access to these
horrendously expensive machines. Dweebs built ham radios and vacuum
tube "hi-fi" gear instead. I had a tiny lab in the basement filled
with surplus electronic equipment that was constantly being
reconstituted into new devices. One "personal computer" existed:
Heathkit's EC-1 analog computer
was a $200 monster comprising just 9 vacuum tube op amps. Designed for
simulating differential equations, users programmed it by wiring the
amplifiers together using patchcords, resistors and capacitors. My
lust for it remained unsated due to the impossible cost.

At age 13 I was expected to spend Saturdays doing free janitorial
duties for my dad's underfunded startup. The usual grumbling ceased
when I discovered the other half of the deal: components swept from
the engineering lab's floor were mine. Even better, those engineers
were slobs who dribbled parts faster than infants lose their rattles.
Resistors, transistors, and even digital ICs started filling the parts
bins at home, and far too many hours spent wiring all sorts of
circuits taught me the nature of each device.

A lucky break at 16 landed me a $1.60/hour job as an electronics
technician. While Neil Armstrong frolicked across the Sea of
Tranquility we built ground support equipment for Apollo and other
programs. Exciting? You bet! Feeling flush with cash I paid $4 an hour
for dial-up access to a Honeywell mainframe, my first exposure to any
sort of computer. An ASR-33 Teletype at a friend's school gave us
time-shared Fortran at 110 baud.

But surely there was a way to get my own computer! All attempts to
build one failed, doomed by little money and less knowledge. So little
that most of these designs accepted simplified Fortran as machine
language, since I'd never heard of assembly, let alone machine, code.
And any sort of memory was simply not available to a high school kid.

Senior year two of us hitchhiked from DC to Boston to visit an
electronics surplus store. There I bought a 13,000 bit core memory
box. No drivers, no electronics, just 26 planes of 512 bits arrayed in
X-Y matrices.

Was it our long hair? Maybe the three unheeded warnings to get off
the New Jersey Turnpike contributed. Gary and I found ourselves in a
New Jersey jail cell, nailed for hitching. The police were surprised
to discover the core instead of drugs in our backpacks. "What's this?"
the chief growled. I timidly tried to convince him it was computer
memory, the last thing he expected to find on a pair of hippie-freaks.
They eventually let us go, me still clutching the core box. 37 years
later it sits on my desk, a reminder of both long-lost youth and the
breathless pace of technology.

I became a reluctant college student, reluctant only till I
discovered the university's Univac 1108, a $10 million mainframe
packing about as much power as one of today's ubiquitous calculators.
Class attendance became a last option before exams, pushed to the
bottom of the priority list after work and all night computing. As I
learned the secrets of assembly language and operating system
vulnerabilities the limits of the usual $50/semester account were
easily transcended.

The 1108 ran at 1.3 MHz, had 768K (not megs) of core memory, and used
two-ton, six-foot-long spinning drums each storing 45 Mb. Punched cards
were the GUI of that age. The machine had a ridiculous instruction set
lacking even a stack, an awkward 36-bit word, and the annoying habit
of crashing more often than Windows 2.0, especially when final
projects were due. But I loved that computer, and wasted absurd
amounts of time developing tools and applications for it.

To stay abreast of the latest in electronics I wrangled a free
subscription to EDN, probably offered due to considerable exaggeration
about my title. Pathetically it was more fascinating to me than the
Playboys my friends hoarded. In 1971 the magazine announced the
creation of a computer on a chip. Chips had never held more than a few
hundred transistors - was this a hoax?

The "computer on a chip" was more marketing hype than reality, as the
4004 required a tremendous amount of external support circuitry. But
it was a huge advance in computing. Since no dreamer wanted a 4-bitter
as a home PC, the only market it could target was the as-yet-unnamed
and almost non-existent one of embedded systems. Intel invented not only
the microprocessor, but for all intents and purposes the entire notion
of cheap embedded systems. Yet according to The Microprocessor: A
Biography (Michael S. Malone, 1995, Springer-Verlag, NY ISBN
0-387-94342-0) the company didn't at first understand the implications
of this new innovation - they were afraid the yearly microprocessor
market was a mere 2,000 chips!

Meanwhile I'd finally figured out the secrets of computers and built
a 12-bit machine that actually worked. It used hundreds of TTL ICs -
no microprocessor - wired on vectorboard, using brightly colored
telephone cable soldered directly to each chip's pins. I couldn't
afford sockets or wire-wrap wire. My Heathkit scope helped
troubleshoot the logic, but since the computer was fully static it
could even run at a fraction of a hertz. A simple voltmeter could
follow bits flipping. Once the logic worked I cranked up the clock to
100 KHz or even 1 MHz, depending on how adventurous I felt. Memory was
768 words of semiconductor RAM (36 of Intel's 1101 256 bit static
RAMs) and 256 words of 1702A EPROM.

$50 procured a World War II vintage ASR-15 teletypewriter capable of
50 baud communications. Powered by a half-horse motor it must have
weighed 200 pounds. The noise that beast made was indescribable.
The neighbors in the apartment below sure learned to hate it.
It spoke Baudot, sort of a 5-bit precursor to ASCII.
Eventually I managed to get a bootloader working in EPROM (using a
bit-banging UART), and then wrote a monitor program that was loaded
from paper tape.
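
For the curious, bit-banging a UART just means the code itself wiggles
the serial line and times every bit. The original was hand assembly on
that homebrew machine, talking 5-bit Baudot at 50 baud, so the C
fragment below is only a rough modern sketch of the transmit side;
set_tx_line() and bit_delay() are invented stand-ins for the real
output port and timing loop, and it sends 8 data bits for familiarity.

#include <stdint.h>
#include <stdio.h>

/* Stand-ins for the real output port and bit-time delay loop. */
static void set_tx_line(int level) { putchar(level ? '1' : '0'); }
static void bit_delay(void)        { /* spin for one bit time */ }

/* Send one character: start bit (0), data bits LSB first, stop bit (1). */
static void uart_send(uint8_t c)
{
    set_tx_line(0);                  /* start bit */
    bit_delay();
    for (int i = 0; i < 8; i++) {    /* data bits, least significant first */
        set_tx_line((c >> i) & 1);
        bit_delay();
    }
    set_tx_line(1);                  /* stop bit */
    bit_delay();
}

int main(void)
{
    uart_send('A');                  /* prints 0100000101: start, 'A' LSB-first, stop */
    putchar('\n');
    return 0;
}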

By the time the machine worked it was utterly obsolete. For in 1972,
Intel released the 8008, the first 8 bit microprocessor. Like its
predecessor, this part was at sea unless surrounded by lots of support
circuitry. But the device, in an 18-pin package, offered what seemed
like a vast 16k address space. It used three power supplies (+12, +5,
and -9) and two twelve-volt clocks.

Panic set in at the company where I still worked as a technician. The
8008 made decent amounts of computing available for a reasonable
price. It meant we could build a new kind of device, a machine that
analyzed oil coatings on nylon, which needed far more intelligence
than possible via hardwired logic. The problem? None of the engineers
knew how to program.

A consultant saved the day, writing thousands of lines of PL/M and
creating one of the very first production embedded systems. But he was
expensive and the company was broke. Work had started on another product,
one that used infrared light to measure the amount of protein in
wheat. Someone figured out that I knew assembly language, so I was
suddenly promoted to engineer. Classes, on those rare occasions I
showed up, seemed utterly irrelevant compared to the thrill of messing
with the machines.

The first of these grain analyzers used an 8008 with 4k of 1702A
EPROMs. 4k doesn't sound like much, but it required sixteen - sixteen! -
256 byte chips occupying an entire PCB. The CPU ran at a blistering
800 KHz, the device's max rate, and just about the same speed as the
by now forgotten Univac 1108. Which was still at the same school
noisily reminding me about upcoming finals.

Intel provided a rudimentary development system called the Intellec 8
which had a single hardware breakpoint set by front-panel switches.
Its only I/O device was an ASR-33 TTY, a 10 character
per second unit incorporating a paper tape reader and punch. No disks,
no mass storage, no nuthin'.

To boot the Intellec 8 we'd manually load a jump instruction into
memory via front panel switches and press "run". A bootloader then
read the editor in from paper tape. A crude, very GUI-less editor let
us enter our source code and correct mistyping (lots - the ASR-33 was
hardly finger-friendly). The final edited source was punched to paper
tape. These tools were all buggy; once the editor output my hours of
typing completely reversed, printing the last character first and the
first last. I started keeping a bottle of Mylanta at hand.

The cruddy tools gave us plenty of incentive to modularize, so even a
small program comprised many separate source tapes.

The 10 CPS tape reader needed a couple of hours to load the
assembler, which then read each source tape three times, once for each
of its passes. Often - oh how often - the mechanical reader missed a
zero or one (hanging chad isn't new) so one of the three reads
wouldn't be identical to the others. Phase error - start all over! If
fate smiled and all the stars of the Zodiac aligned the machine
punched a binary relocatable tape.

Next load the linker (another hour or two), feed the binary tapes
twice each, and if everything worked perfectly it spat out a final
absolute binary image of our program.

Our 4k program probably consisted of 5000 source lines. A complete
build took three days.

Needless to say we rarely reassembled. Each morning we'd load the
current binary image and start debugging. Errors were fixed by
changing the machine code of the offending instructions. Sometimes the
fix was shorter than the code in memory so we'd drop new code in place
of the old, padding unused bytes with NOPs. Longer patches went into
unused RAM, accessed via a JMP plopped on top of the offending code.
Careful notes on the listing logged each change so a later edit could
make the source match the code. A day of debugging might result in
quite a few patches, preserved by punching a tape of the current
binary image. The next day we'd load that tape and continue debugging.
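
In case the mechanics aren't obvious, here's a rough modern sketch in C
of the two kinds of patch - nothing like this existed then, and NOP
(0x00) and JMP (0xC3 plus a little-endian address) are simply the 8080
opcodes involved: a shorter fix gets dropped in place and padded with
NOPs, while a longer one is reached by overwriting the original code
with a JMP to a patch area in unused RAM.

#include <stdint.h>
#include <string.h>
#include <stdio.h>

#define NOP 0x00   /* 8080 NOP opcode */
#define JMP 0xC3   /* 8080 JMP opcode, followed by a 16-bit little-endian address */

/* Overwrite old_len bytes at offset with a shorter fix, padding with NOPs. */
static void patch_in_place(uint8_t *mem, uint16_t offset, uint16_t old_len,
                           const uint8_t *fix, uint16_t fix_len)
{
    memcpy(mem + offset, fix, fix_len);
    memset(mem + offset + fix_len, NOP, old_len - fix_len);
}

/* Plop a JMP on top of the offending code, sending execution to a patch
 * area in unused RAM (the code there would end with a JMP back). */
static void patch_via_jump(uint8_t *mem, uint16_t offset, uint16_t patch_area)
{
    mem[offset]     = JMP;
    mem[offset + 1] = (uint8_t)(patch_area & 0xFF);   /* low byte first */
    mem[offset + 2] = (uint8_t)(patch_area >> 8);     /* then high byte */
}

int main(void)
{
    static uint8_t mem[0x4000];                 /* pretend 16k of memory */
    const uint8_t fix[] = { 0x3E, 0x05 };       /* e.g. MVI A,5 */

    patch_in_place(mem, 0x0123, 4, fix, sizeof fix);  /* 2-byte fix, 2 NOPs */
    patch_via_jump(mem, 0x0200, 0x3F00);              /* long fix lives at 0x3F00 */

    printf("%02X %02X %02X %02X\n", mem[0x0123], mem[0x0124],
           mem[0x0125], mem[0x0126]);           /* 3E 05 00 00 */
    return 0;
}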

That 4k of code did very sophisticated floating point math, including
noise reduction algorithms, least squares curve fits, and much more.
It was a huge success that led to many more embedded products.
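
That original 8008 code, with its hand-rolled floating point, is long
gone; as a rough idea of the kind of math it did, here's a minimal
ordinary least-squares line fit in modern C (the data in main() is made
up for illustration).

#include <stdio.h>

/* Fit y = a*x + b to n points by ordinary least squares. */
static void least_squares(const double *x, const double *y, int n,
                          double *a, double *b)
{
    double sx = 0, sy = 0, sxx = 0, sxy = 0;
    for (int i = 0; i < n; i++) {
        sx  += x[i];
        sy  += y[i];
        sxx += x[i] * x[i];
        sxy += x[i] * y[i];
    }
    double d = n * sxx - sx * sx;       /* denominator */
    *a = (n * sxy - sx * sy) / d;       /* slope */
    *b = (sy * sxx - sx * sxy) / d;     /* intercept */
}

int main(void)
{
    double x[] = {1, 2, 3, 4}, y[] = {2.1, 3.9, 6.2, 8.0};
    double a, b;
    least_squares(x, y, 4, &a, &b);
    printf("y = %.3f*x + %.3f\n", a, b);   /* roughly y = 2*x */
    return 0;
}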

Intel's 1974 introduction of the 8080 wasn't much of a surprise. The
shocker was an article in Popular Electronics about a home computer
called the Altair 8800,
an 8080 machine that sold for $400 in kit form. Not much memory
came with it for that price, but since the processor alone sold for
$400 engineers couldn't imagine how MITS pulled off this miracle. At
the time we imagined they used reject parts, but learned later the
company bought CPUs in high volume for $75 each.

The next generation of our product used an 8080, so we bought a pair
of Altairs as development platforms. With two machines we could
cannibalize boards to keep one working... most of the time. The Altair
was designed to meet a price, as evidenced by poor PCB layout and
unreliable DRAM circuitry. Constant crashes and data loss were the
norm. Tagamet became available about this time and prescriptions for
the drug littered the lab benches.

After much complaining and more loss of time the boss relented and
spent $20,000 (equivalent to $79k today) on Intel's new MDS-800. It
was a general purpose 8080 computer that included an editor,
assembler, linker, various other software tools, and the very first
in-circuit emulator. A huge device, it became familiarly known in the
industry as the "Blue Box". Mass storage consisted of two 8" floppy
disks holding about 80Kb each.

Though our productivity soared, development times grew longer. The
8080's 64k address space removed program size constraints, so
customers and marketing started demanding ever-more features. The
programs swelled to 16k, 32k and beyond. EPROM sizes grew in step so
there were no barriers to code bloat. Except engineering time, but
even then managers chanted the mantra "it's only a software change".

Everything was still written in assembly language. We experimented
with Basic from a tiny outfit named Microsoft, and their later
Fortran. Neither was adequate for real-time embedded work. No C
compilers existed for micros at the time. I created an Algol-like
pseudo code
implemented in assembly macros, but the MDS-800 took over an hour to
translate a little 100-line program.

But assembly language was fun, if we could find programmers. There
were never enough. By now the company was in a growth spurt and every
product used micros. As is normal in times of turmoil the Peter
Principle kicked in and I found myself in charge of all digital design
and firmware. A growing staff barely kept pace with the demand for new
products. We had our own offices and lab; those and our youth, the
music, and odd hours divided us from the older analog folks. At least
they seemed old; in retrospect none could have passed 30.

Processors were still very expensive. CPU chips cost hundreds of
dollars. In 1975 MOS Technology introduced the 6502 at the astonishing
price of $25. That sparked the Jobs/Wozniak whiz kids to develop the
Apple computer. I bought one of these chips thinking to make a home
machine. But for over a year home was a VW microbus, usually parked
outside the office or at a rest stop on Route 95. After a Canadian
vacation I reentered the US at a remote Maine town, hoping to take in
the sights of the great North Woods. In an incident eerily echoing the
NJ Turnpike bust customs agents, seeing the long hair and the
microbus, stripped the van. They found the 6502 in the glove
compartment. Here too, the officers were unaware of the pending
microprocessor revolution and disbelieved my story about a computer on
a chip. Sure, kid. They didn't know what that 40-pin DIP
was, but it sure looked like contraband.

We followed the evolution of technology, moving to the 8085 when it
came out, and then, looking for much more horsepower to run the
graphical displays our customers demanded, designed a system using
AMD's 2901 bit-slice components. These were 4 bit elements strung
together to create processors of arbitrary word length with custom
instruction sets. A Signetics 8X300, a wacky DSP-like processor using
16 bit instructions but 8 bit data words, sequenced interactions
between the bit-slice device and a Nova minicomputer.

We still flirted with minicomputers, searching on some products for
more horsepower than was possible with a micro. And some of these
devices had to run Nova legacy code. This Data General 16-bitter
offered decent performance for much less than the more popular and
wonderfully orthogonal PDP-11.

The original Nova 1200 moved data through a single 4-bit ALU, using
four cycles to do any arithmetic operation. It's hard to imagine in
this day of free transistors but there was a time when hardware was
expensive. Later models used full 16 bit data paths.
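
To make that concrete, here's a sketch in C of what a single 4-bit ALU
taking four cycles per operation means: a 16-bit add done one nibble
per pass, with the carry rippling between passes. It's an illustration
of the idea, not Data General's actual microcode.

#include <stdint.h>
#include <stdio.h>

static uint16_t add16_with_4bit_alu(uint16_t a, uint16_t b)
{
    uint16_t result = 0;
    unsigned carry = 0;
    for (int nibble = 0; nibble < 4; nibble++) {        /* four cycles */
        unsigned an  = (a >> (4 * nibble)) & 0xF;       /* 4-bit slice of a */
        unsigned bn  = (b >> (4 * nibble)) & 0xF;       /* 4-bit slice of b */
        unsigned sum = an + bn + carry;                 /* one 4-bit ALU pass */
        result |= (uint16_t)(sum & 0xF) << (4 * nibble);
        carry   = sum >> 4;                             /* carry to next pass */
    }
    return result;
}

int main(void)
{
    printf("%04x\n", add16_with_4bit_alu(0x1234, 0x0FCD));  /* prints 2201 */
    return 0;
}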

All had non-volatile core memory. Data General was very slow to
provide ROMed boot code, so users were expected to enter the loader
via the front panel switches. We regularly left the Nova's boot loader
in a small section of core. My fingers are still callused from
flipping those toggle switches tens of thousands of times, jamming the
same 30 instructions into core whenever a program crashed so badly it
overwrote the loader. At first all of the engineers got a kick out of
starting up the Nova, flipping switches like the captain of some
exotic space ship. Later we learned to hate that front panel. It's
time to enter those instructions AGAIN! Acid indigestion gave way to a
full ulcer.

We put in insane hours. 100-hour weeks for 40 hours' pay weren't
unusual, but there was little complaining. The technology was so
fascinating! But by 1976 I was living on an old wooden sailboat and
was rich - or so it seemed, with the amazing sum of $1500 in the bank.
I quit to sail around the world.

And sank a year later. Back to the same job, but now I'd felt freedom
and was itching for something more exciting. I resigned and started a
consulting outfit with a good friend. For two years we built custom
embedded systems for a variety of customers. A few stick out in memory
- like the security system for the White House, which used over a
hundred tightly-coupled 8-bit CPUs. When the contract ended I lost my
White House pass the same day as Ollie North, though with much less
fanfare.

We built a variety of deep ocean probes that measured O2,
temperature, salinity, currents, and other parameters. These had to
run for months to years on small batteries, so we used RCA's 1802, at the
time the only CMOS processor. It was a terrible chip lacking even a
call instruction, but sipped so sparingly from Vcc that it hardly
needed a power switch. Later we built a system that also used an 1802
to measure how fruit ripens while being shipped across oceans. That
project was doubly rewarding when stevedores in Rotterdam dropped a
shipping container on the device. The replacement job paid the bills
for another month or two.

A 12-ton gauge that moved on railroad tracks as it measured the
thickness of white-hot steel used a PDP-11 minicomputer interfaced to
various 8-bit microprocessors. The plant's house-sized main motor
reversed direction every few seconds to run the steel back and forth
under rollers, tossing staggering amounts of RFI into the air. Poorly
designed cabling could quite literally explode from coupled EMF. We
learned all about shielding, differential transmission, and building
smart software to ignore transients.

In those two years we starved, never having learned the art of
properly estimating the cost of a job. Both of us agreed the
friendship was more important than the company, so we sold the assets
and each started our own outfits. That proved wise as we're still very
close, and today our sons are best friends.

I'd had it with consulting. With every project the consultant more or
less starts from scratch. A product, though - that seemed the ticket.
Design it once and sell the same thing forever. But cash was scarce so
I consulted during the day and wrote proprietary software at night.

The result was MTBASIC, a Basic compiler for the Z80 that supported
multitasking. For a development platform I built a Z80 CP/M machine
using a 40 character-wide TV monitor and a single floppy disk whose
controller was a half-PCB of discrete logic rather than the fancy but
expensive FDC chips of the time.

The compiler, targeted at embedded apps, was interactive like an
interpreter yet produced native compiled code that could be ROMed.
Compile times were nearly instantaneous and the generated code even
faster. Built-in windowing and a host of other features drove the
source to over 30k lines of assembly. But this compiler was the cutest
code I ever wrote. Working out of the house I managed to sell some
10,000 copies for $30 each over the next few years.

In 1981, IBM introduced the PC. Using a 4.77 MHz 8088 and limited to
640k of RAM, this machine caused the entire world to take notice of
the microprocessor industry. Though plenty of "personal computers"
already existed, these were mostly CP/M based Z80 models that required
enormous techie competence and patience. The one exception was the
Apple ][, but that only slowly made its way into the business world.

Though a very healthy and dynamic embedded industry existed, then as
now it was mostly invisible to the average Joe. Few smart consumer
products existed, so Joe's perception of computers was still the
whirling tape drives in sci-fi programs or the ominous evil of HAL in
the movie "2001- A Space Odyssey". But the IBM PC brought computers
into the mainstream. Normal people could own and master these
machines. Or so they're still telling us.

I bought an early PC. Unbelievably, floppies were optional. Most
customers used cassette tapes. My two-floppy model with 256K of RAM cost
$7000. I ported MTBASIC to the PC, recoding it in 8088 assembly, and
found a willing market.

The top floor of the house was devoted entirely to offices now, the
main level for storage. We moved into the basement. Neighbors
complained about daily delivery trucks. Ceiling-high stacks of manuals
and pallets of shipping boxes in the living room made entertaining
challenging, or at least quirky.

Despite brisk sales, advertising ate all the profits. I was still
consulting, and a government customer needed a battery-operated data
collection system. National's NSC800 was ideal, but since tools didn't
exist for the CPU it seemed natural to make a simple little ICE, which
worked surprisingly well.

Eventually that Eureka moment hit - why not sell these emulators?
Since the NSC800 was so similar to the Z80 and 8085 it was a snap to
expand the product line.

Some customers built astonishing control systems with MTBASIC. But
the C language slowly gained acceptance in the embedded space and
Microsoft's various flavors of Basic ate away at our non-embedded
market. The PC killed off CP/M, obsoleting that version of the
compiler. As the product's sales slipped, though, ICE revenues more
than compensated.

The hardware design was simple, using only 17 ICs. The emulation
processor was also the ICE control CPU. A bit banging UART
complexified the software but saved a chip or two. But the firmware -
oh my god, the firmware was a nightmare. And more fun than you can
imagine. For on a breakpoint the CPU had to store the entire context
of the executing program. Since I'd made the ridiculous decision to
use no target system resources, when transitioning through a
breakpoint the hardware had to swap in local ICE RAM and turn off user
memory. State-saving PUSHes stashed information wherever the user's
stack pointer had been - anywhere in the ICE's address space. Hardware
stripped off some of the address bits to ensure the data went into the
emulator's RAM, but the need to minimize chip count meant it was usual
for the writes to wipe out local variables used by the ICE.
Reconstructing the data while correctly preserving the target context
was quite a challenge. It was a cool design, though it probably should
have used more hardware to simplify the code.
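
To give a flavor of the problem, here's a toy model in C of that
breakpoint entry - the real thing was Z80 assembly plus custom
hardware, and ICE_RAM_SIZE, ice_ram[] and the register values below are
invented for illustration. The point is that the context-saving pushes
go through address-stripping hardware, so they land somewhere in the
emulator's small local RAM no matter where the user's stack pointer
points - possibly right on top of the monitor's own variables, which
then have to be reconstructed.

#include <stdint.h>
#include <stdio.h>

#define ICE_RAM_SIZE 256                    /* pretend local RAM: 256 bytes */
static uint8_t ice_ram[ICE_RAM_SIZE];

/* "Hardware" address stripping: only the low bits reach the local RAM. */
static void write_mapped(uint16_t addr, uint8_t v) { ice_ram[addr % ICE_RAM_SIZE] = v; }

/* Push one 16-bit register through the mapped bus, as the breakpoint stub did. */
static uint16_t push16(uint16_t sp, uint16_t reg)
{
    write_mapped(--sp, (uint8_t)(reg >> 8));      /* high byte first, Z80 style */
    write_mapped(--sp, (uint8_t)(reg & 0xFF));    /* then low byte */
    return sp;
}

int main(void)
{
    uint16_t user_sp = 0xC37A;              /* the user stack could be anywhere */
    uint16_t regs[]  = {0x1234, 0xBEEF};    /* a couple of "registers" to save */

    uint16_t sp = user_sp;
    for (int i = 0; i < 2; i++)
        sp = push16(sp, regs[i]);

    /* The saved context ended up somewhere in ice_ram[], possibly on top of
     * the monitor's own variables -- which is why the real firmware then had
     * to relocate it and reconstruct whatever got overwritten. */
    printf("context saved at mapped offset 0x%02X\n", (unsigned)(sp % ICE_RAM_SIZE));
    return 0;
}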

We sold the units for $595 each. Though parts and labor only ran
about $100, advertising and overhead burned cash at a scary rate. But
the business grew. Forced out of the house by space needs, we rented a
facility, the first of many as growth demanded ever-more square
footage.

Over time I learned the basic law of the embedded tool market: keep
prices high. Every application is truly unique so customer support is
hugely expensive. Support costs are about the same for a $600 or $6000
tool. That's why today a simple BDM, which might use only a few
dollars of parts, can cost thousands. And why Linux is free but plenty
of outfits will happily drain your fortunes to help get it going. So
we developed much more powerful units for prices up to $10k, keeping
hardware and production costs around $1k.

Those were the glory days of emulators, when chip companies funded
new ICE designs and customer demand was high. Our product line grew to
include many 8 and 16 bit processors. The emulators themselves became
hugely complicated, stuffed with boards crammed with very high-speed
logic, memory, FPGAs and PLDs. 4 MHz processors accelerated to 8, then
12, and to 40 MHz or more. Even the few nanoseconds of delay imposed by
a single chip ate an unacceptable 20% of the bus cycle, so more exotic
technology placed closer to the customer's target CPU socket became
the norm. Logic design morphed into high-speed RF work. Maxwell's
laws, at first only vaguely remembered from those oft-skipped college
electromagnetics classes, were now our divine guidance. Firmware
content skyrocketed. We used plenty of C, yet the emulators needed
vastly more assembly than most products, as plenty of very low level
bit twiddling was required.

Worn down by 70-hour weeks and the toll on my personal life I sold
that company in 1996. Yet in many ways the tool business is the best
of the embedded world. I met so many fascinating developers and poked
deeply into their intriguing projects. Some used 8-bit processors to
control fleets of aircraft while others had 32 bitters loafing along
handling very slow inputs. A few ran for years off two AAs while
others sucked from a 5 volt firehose. Applications varied from the
absurd to the most noble imaginable.

After 35 years in this industry there are times I despair for its
future. How can mere humans cope with million-line-plus programs? Is
firmware quality an oxymoron? Will engineering jobs migrate at light
speed around the world in pursuit of lowest possible costs?

I've learned, though, that the embedded revolution is one of the
greatest outcomes of a troubled 20th century. No industry is untouched
by our work. A flake of silicon reduces power plant emissions by
orders of magnitude, smart pumps irrigate subsistence farms in Nepal,
and electronics in an automatic external defibrillator turn a Good
Samaritan into a veritable cardiac surgeon.

In those Dilbert moments when the Mylanta isn't strong enough, if
therapy or kicking the dog seem the only hope of getting through
another day, take pride in your profession. We have profoundly changed
the world, mostly for the better.

And that's a pretty darn good legacy.






                                                    Jack Ganssle




