Blue Skies in Camelot (Continued): An Alternate 80s and Beyond

Speaking of predators being caught early... @President_Lincoln, do you see any way that the Roman Catholic Church ITTL can handle its widespread child abuse problem better than IOTL, or at least have it exposed much more quickly?

(I feel like I've asked a similar question before, so I'm sorry if this sounds repetitive.)
No need to apologize. :)

As a (semi-lapsed but still believing) Catholic man myself, I can say that this issue is near and dear to my heart. In general, I believe that the So Have I Movement ITTL, if handled correctly, could open the door to earlier public attention and awareness than IOTL. I also believe that this issue, while absolutely systemic within the Roman Catholic Church (and something the church specifically needs to address), is also prevalent in Protestant churches and other organizations (the Boy Scouts, etc.) that involve volunteers interacting with (and being given the opportunity to abuse) children. The real solution, in my opinion, is to call out this problematic behavior society-wide and make it clear that while this is a Catholic problem, it is also very much a wider systemic issue.
You're absolutely right, it's not a problem limited to the Roman Catholic Church. I think every religious organization has to deal with widespread child abuse at one point or another, and the Boy Scouts and other organizations are of course just some examples. So you are right that it's a wider systemic issue.

It makes me wonder how President RFK would react to it, along with the rest of the Kennedy family.
I think in a way they'd all be shocked and disappointed. Bobby would probably be the most affected, since he's arguably the most religious of them. I don't know how devout Ted is, but I'm sure he'd have the same reaction. As for Jack and Jackie, I don't know what their reaction would be; I do know that Jack has somewhat re-devoted himself again since the assassination. All in all, I'd say they'd all be hurt one way or another.
Absolutely. I agree that Bobby would probably be hit the hardest. All of them will eventually come out in support of the victims and call upon the church to investigate and bring those responsible for these crimes to justice. I could see President RFK, the most well-known Catholic in the country, coming out in support of the victims as a pretty big moment in terms of optics.
Now that you've mentioned this issue, I'm hoping that the Roman Catholic Church ITTL can handle and face this situation better than IOTL. As a Catholic myself, I'm also mad that they're doing the bare minimum when it comes to child abuse, and the church has earned plenty of bad reputation in the past, like the Spanish Inquisition (which became a running joke in Monty Python's Flying Circus and a Broadway-style musical number by Mel Brooks in History of the World, Part I). I even watched Monty Python's Life of Brian (now known as Monty Python's The Gospel According to St. Brian ITTL) because it's funny.

Here in the Philippines during the Spanish period, the church and the state were inseparable for more than three centuries. Our country's national hero, Jose Rizal (who lived from 1861 to 1896), wrote his famous novels Noli Me Tangere (Touch Me Not) and El Filibusterismo (The Reign of Greed), which satirized the church and helped inspire the 1896 Philippine Revolution, which culminated in the 1898 declaration of Philippine independence from Spain. His life and works became part of our school curriculum, and I recommend you geniuses read those two novels, because they're among the greatest and most important Philippine literary works of all time (IMHO).

I think the Kennedys will definitely be shocked and disappointed, with Bobby hit the hardest. I'm hoping that he helps the victims of child abuse, investigates the abusers, and brings them to justice. I don't know if he'll order the FBI, or maybe the ICC, to bring them to court. What would Pope Stanislaus do now that the So Have I Movement ITTL is in its early years? Would His Holiness cooperate with President RFK to act immediately, arrest the people behind this, and prevent more crimes, or not? I'll let you decide, Mr. President.
 
The pope may definitely do something different if RFK builds up the pressure. Guess we'll see one way or another.
 
I just binge-read the entire timeline. This is one of the best alt-hist timelines I've ever read; thanks for making all of this.
 
Welcome to the timeline, and we're hoping you'll stick with us and binge-read more of Blue Skies in Camelot as the next chapters come out.
 
Just a random thought with regards to pop culture. John Carpenter's They Live (and other movies touching on similar things) would probably be a very different movie, assuming it still gets made, mostly because no Reagan means no "Greed is good" mentality.

That said, he could still make it a commentary on class, basically talking about how the rich are always the ones pulling the strings (every President since Kennedy has had what I can only describe as "Fuck you money").
 
As a huge fan of Carpenter in general and They Live in particular, I love this idea. :D I could definitely still see him making the film and having it be a commentary on class, as you say. Maybe instead of having the focus be the propaganda aspect, it's more about the "illusion of choice" in politics.
 
Chapter 158
Chapter 158 - Shake it Up: The PC & Video Game Revolutions
Above: The IBM Personal Computer (left); the X-128 Computer along with Xerox executive Steve Jobs (right); in 1982 both would become early icons of the personal computing revolution.

“Shake it up, make a scene
Let them know what you really mean
And dance all night, keep the beat
And don't you worry 'bout two left feet
Just shake it up, oo, oo
Shake it up, oo oo, yeah
Shake it up, oo, oo
Shake it up, oh, yeah”
- “Shake it Up” by the Cars

“I think it's fair to say that personal computers have become the most empowering tool we've ever created. They're tools of communication, they're tools of creativity, and they can be shaped by their user.” - Bill Gates

“Stay hungry, stay foolish.” - Steve Jobs

The personal computing revolution, which had its origins in the microchip developments of the mid to late 1970s, exploded into life in the early 1980s.

The 1977 release of what came to be called the “Trinity” - the Commodore PET 2001, the Xerox X-2, and the Tandy/Radio Shack TRS-80 Model 1 - marked a turning point. Indeed, in the run up to the release of the “Trinity”, several firms were in fierce competition to develop and release the first truly successful commercial PC.

Chuck Peddle - an American electrical engineer who had cut his teeth at Motorola - designed the Commodore PET (Personal Electronic Transactor) around the MOS 6502 processor, which he had also designed. The PET was, in essence, a single-board computer with a simple TTL-based CRT driver circuit driving a small, built-in monochrome monitor with 40×25 character graphics. The processor card, keyboard, monitor and cassette drive were all mounted in a single metal case. In 1982, Byte Magazine referred to the PET design as “the world's first personal computer”.
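For the technically curious, here is a minimal sketch of what “40×25 character graphics” means in practice: the screen is just a grid of bytes in memory that the CRT driver circuitry scans out, so drawing a character is a single memory write. The buffer layout and the screen code used below are illustrative assumptions, not details from the timeline.

```c
/* Illustrative sketch only: addressing a 40x25 character-mapped display
 * like the PET's. The screen code value below is an assumption made for
 * demonstration purposes. */
#include <stdint.h>

#define COLS 40u
#define ROWS 25u

/* Place one screen code at (row, col). On the real machine this would be a
 * store into memory-mapped video RAM that the TTL CRT driver scans out. */
static void put_char(volatile uint8_t *screen, unsigned row, unsigned col,
                     uint8_t screen_code)
{
    if (row < ROWS && col < COLS)
        screen[row * COLS + col] = screen_code;
}

int main(void)
{
    static uint8_t fake_screen[ROWS * COLS]; /* stand-in for 1,000 bytes of video RAM */
    put_char(fake_screen, 12, 20, 0x01);     /* assumed screen code for 'A' */
    return 0;
}
```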

The PET shipped in two models: the 2001–4 with 4 KB of RAM and the 2001–8 with 8 KB. The machine also included a built-in Datassette for data storage located on the front of the case, which left little room for the keyboard. The 2001 was announced in June 1977 and the first 100 units were shipped in mid-October of that year.

Although the machine was fairly successful, there were frequent complaints about the tiny calculator-like keyboard, often referred to as a "chiclet keyboard" due to the keys' resemblance to the popular gum candy. This was addressed in the upgraded "dash N" and "dash B" versions of the 2001, which put the cassette outside the case, and included a much larger keyboard with a full stroke non-click motion. Internally a newer and simpler motherboard was used, along with an upgrade in memory to 8, 16, or 32 KB, known as the 2001-N-8, 2001-N-16 or 2001-N-32, respectively.

The PET was the least successful of the 1977 Trinity machines, with under 1 million sales.


Above: The “Trinity” of early PC units: the Commodore PET 2001 (left), the Xerox X-2 (center), and the Tandy/Radio Shack TRS-80 Model 1 (right).​

Steve Wozniak (AKA “Woz”) developed the X-1 and subsequent X-2 designs based on the earlier “Alto” design, which was in turn developed at Xerox PARC (Palo Alto Research Center) in the early 1970s. The X-2 had color graphics, a full QWERTY keyboard, and internal slots for expansion, which were mounted in a high quality streamlined plastic case. The monitor and I/O devices were sold separately. The high price of the X-2, along with limited retail distribution, caused it to initially lag in sales behind the other Trinity machines. However, in 1979 it surpassed the Commodore PET, receiving a sales boost attributed to the release of the extremely popular VisiCalc spreadsheet which was initially exclusive to the platform. Though it fell back to 4th place after the release of Atari’s popular 8-bit systems, the X-2 maintained “steady” sales growth. This was largely credited to its durability and longevity; it boasted a lifetime that was up to eight years longer than other machines. By 1985, the X-2 had sold more than 2.1 million units. By the time production ceased in 1993, that number had risen to 4 million. Clearly, Wozniak had designed one Hell of a computer.


Above: Steve Wozniak, lead designer of the “X-2” computer.​

Finally, the Tandy Corporation (better known as Radio Shack) introduced the TRS-80, which would be retroactively known as the Model I as the company expanded the line with more powerful models. The Model I combined the motherboard and keyboard into one unit with a separate black-and-white monitor and power supply. Tandy's more than 3,000 Radio Shack storefronts ensured the computer would have widespread distribution and support (repair, upgrade, training services) that neither Commodore nor Xerox could match.

Despite this huge capacity for sales, however, the Model I suffered from myriad technical difficulties. For one thing, it could not meet FCC regulations on radio interference due to its plastic case and exterior cable design. There were also internal problems. Keystrokes would randomly repeat at times. The earliest versions of the hardware produced bizarre glitches. Though these were promptly patched, by that time the damage was done. Amongst enthusiasts, the Model I developed a reputation as a “glitchy”, unreliable machine. Radio Shack managed to sell about 1.5 million of them before discontinuing production in favor of their Model II and later, Model III.

A few years later, in January of 1980, as the nation reeled from news that Mo Udall would not seek a second term and prepared for truly contested primaries from both parties, Byte magazine announced in an editorial that “the era of off-the-shelf personal computers has arrived”.

Whereas before, PCs were seldom, if ever, found in individuals’ homes, they were fast becoming a consumer product, an appliance that more and more everyday Americans would have access to. Though the author of that article admitted that his own PC had cost him $6,000 cash from his local store, he claimed that those costs were “bound to drop as the technology becomes more widely available”. He couldn’t have known how true that prediction would prove to be.

At the time of that article's publication, aforementioned pioneers like Radio Shack, Commodore, and Xerox manufactured the vast majority of the roughly half a million microcomputers then in existence. As component prices continued to fall, however, many companies entered the computer business. This led to an explosion of low-cost machines known as “home computers” that sold millions of units before the market imploded in a price war in the early 1980s. Below are just a few of the many companies that got in on the “home computer” market.




In the late 1970s, Atari was already a household name in the United States. This was due both to their hit arcade games (Pong, Asteroids, etc.) and to the hugely successful Atari VCS game console, with its iconic cartridges (including a smash-hit home version of Space Invaders). Realizing that the VCS (AKA the “Atari 2600”) would have a limited lifetime in the market before a technically advanced competitor came along, Atari decided they would become that competitor, and started work on a new console design that was much more advanced.

Whilst these designs were under development, the “Trinity” machines hit the home PC market, amidst considerable fanfare. Atari thus faced an important decision: should they continue to focus their attention on video game consoles; or should they shift their efforts toward a home computer system instead? In the end, the company decided that the possible reward was worth the risk. They decided to try their hand at designing a PC. Atari did have some advantages over potential competitors in this market.

For one thing, thanks to the runaway success of the 2600, the company had a sound understanding of the home electronics market. Consumers generally wanted high-quality products that would last a long time and were simple and easy to use. The average American had little understanding of how electronics worked, but if the interface was made intuitive enough, then that wouldn't matter. As a result of these insights, Atari's first commercial PCs - the Atari 400 and 800, released in 1978 and mass-marketed the following year - were virtually indestructible and just as easy to use as their consoles. The design concept was the same as the 2600 - just plug in a cartridge and go. With a trio of custom graphics and sound co-processors and a 6502 CPU clocked about 80% faster than most competitors', the Atari machines had capabilities that no other microcomputer could match.
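As a quick back-of-the-envelope check on that “about 80% faster” claim (using OTL-style figures as an assumption: roughly 1.79 MHz for the Atari machines against roughly 1 MHz for typical rivals):

```c
/* Sanity check of the "about 80% faster" clock claim. The MHz figures are
 * assumptions based on OTL hardware, not numbers taken from the chapter. */
#include <stdio.h>

int main(void)
{
    double atari_mhz = 1.79;  /* assumed Atari 400/800 CPU clock */
    double rival_mhz = 1.00;  /* assumed typical competitor clock */
    double pct_faster = (atari_mhz / rival_mhz - 1.0) * 100.0;
    printf("~%.0f%% faster\n", pct_faster);  /* prints: ~79% faster */
    return 0;
}
```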

Despite these advantages, however, Atari’s initially strong sales (~600,000 by 1981) slowed once faced with competition from the Commodore 64, which saw release in 1982. Eventually, Atari would retrieve the proverbial toe that it had dipped in the home computer market to redouble its efforts on the video game front. By that time, they were facing increased competition from other firms with other consoles.

1982 would thus prove a pivotal year for Atari.

As soon as the 2600 shipped, work began on its successor. This next-generation console would, in time, come to be labeled the “5200”. Management’s hope was that the company’s recent (if modest) success in the PC market would provide a solid platform for launching their next console. The team responsible for designing the 5200 faced a number of challenges, however.


Above: An early prototype of the Atari 5200, its “next-generation” console, after the 2600.​

First, the team had to determine, at a conceptual level, what the 5200 would be. The primary appeal of the console was supposed to be that it would be technically superior to the 2600, boasting better graphics and better performance. However, with how saturated the 2600 market already was, an obvious concern was whether the new 5200 would be backwards compatible. The question provoked fierce debate among the designers.

Proponents argued that consumers would feel “betrayed” and “angry” if their expansive library of 2600 cartridges were suddenly rendered “obsolete and useless” by the next-generation console. Backwards compatibility would also mean that until the 5200 gradually replaced the 2600, Atari could continue to design and release games for the older console.

Those who opposed backward compatibility argued that these supposed strengths were actually weaknesses. Making the 5200 able to play 2600 games would require that hardware components be added to make compatibility with the outdated software possible. This would increase costs that would, in turn, have to be passed along to consumers. Further, management feared that backwards compatibility would discourage coders and third-party game developers from utilizing the new technical capabilities of the next generation console.

In the end, management decided that higher costs, in exchange for a longer shelf life and happy customers, were a trade-off worth making. The 5200 would be backwards compatible.
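To make the trade-off concrete, here is a toy sketch of the “detect the cartridge, switch hardware personality” idea that the extra legacy circuitry enabled. The signature bytes and names are invented for illustration; this is not Atari's actual firmware or hardware design.

```c
/* Toy illustration of backwards compatibility via cartridge detection.
 * Signature bytes and mode names are invented; not Atari's real design. */
#include <stdint.h>
#include <stdio.h>

typedef enum { MODE_5200, MODE_LEGACY_2600, MODE_UNKNOWN } console_mode_t;

/* Hypothetical: read a signature byte from the cartridge header to decide
 * which hardware personality the console should present. */
static console_mode_t select_mode(const uint8_t *cart_rom)
{
    switch (cart_rom[0]) {
    case 0x52: return MODE_5200;        /* invented "native" signature */
    case 0x26: return MODE_LEGACY_2600; /* invented "legacy" signature */
    default:   return MODE_UNKNOWN;
    }
}

int main(void)
{
    const uint8_t old_cart[] = { 0x26 };
    if (select_mode(old_cart) == MODE_LEGACY_2600) {
        /* Map in the extra compatibility hardware whose cost the designers
           debated, then run the old game unchanged. */
        puts("booting in 2600 compatibility mode");
    }
    return 0;
}
```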

Another major challenge came with the limitations imposed by the hardware available in home consoles at the time. For all their bulk, arcade cabinets had the physical space necessary to house enough memory units to facilitate more complex (and engaging) games. Porting these games over to the home consoles often removed features by necessity.

In 1982, Atari received what should have been a golden opportunity. They obtained the rights to develop and publish the console port of Pac-Man, arguably the most popular arcade game of all time. Unfortunately, they were forced to produce two versions (issued on two differently colored cartridges) - one for the 2600 and one for the 5200. The former was far inferior to the latter. To compensate for the lack of ROM space, many visuals were removed, much to the chagrin of fans. The hardware also struggled when multiple ghosts appeared on the screen, creating a flickering effect. This version of the game was panned and did not sell well.
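For readers wondering why the flicker happened at all: hardware of that era could only draw a handful of sprite objects per frame, so ports rotated which ghost got drawn on each pass, and the eye reads the result as flicker. Below is a toy sketch of that time-multiplexing, with generic numbers (this is not the actual port's code).

```c
/* Toy sketch of sprite time-multiplexing, the usual cause of "ghost flicker".
 * The ghost count and frame rate are generic illustrative numbers. */
#include <stdio.h>

#define NUM_GHOSTS      4
#define FRAMES_PER_SEC 60

int main(void)
{
    for (unsigned frame = 0; frame < 8; ++frame) {
        unsigned drawn = frame % NUM_GHOSTS;   /* only one ghost per frame */
        printf("frame %u: ghost %u drawn, %d skipped\n",
               frame, drawn, NUM_GHOSTS - 1);
    }
    /* Each ghost appears on only 1 of every 4 frames: a 15 Hz strobe on a
       60 Hz display, which the eye perceives as flicker. */
    printf("per-ghost refresh: %d Hz\n", FRAMES_PER_SEC / NUM_GHOSTS);
    return 0;
}
```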

The 5200 version was better received, and for good reason. Indeed, when the 5200 was officially released in November 1982, Pac-Man 5200, as it was known to critics, was one of the cartridges included with the console. A wise move. Sales of the 5200 increased dramatically.

Under Warner (their parent company) and Atari's chairman and CEO, Raymond Kassar, the company achieved its greatest success, selling millions of 2600s, 5200s, and personal computers. At its peak, Atari accounted for a third of Warner's annual income and was the fastest-growing company in US history at the time. It would, however, have to face an increasingly competitive market and a price-collapse. More on that later.




Above: The Sinclair ZX Spectrum - Britain’s best-selling computer of the 1980s.​

Sinclair Research Ltd was a British electronics company founded by Sir Clive Sinclair in Cambridge. Sinclair had originally founded Sinclair Radionics, but by 1976 was beginning to lose control over the company and started a new independent venture to pursue projects under his own direction.

Following the commercial success of a kit computer in 1977 aimed at electronics enthusiasts called the MK14, Sinclair Research (then trading as Science of Cambridge) entered the home computer market in 1980 with the ZX80 at £99.95. At the time the ZX80 was the cheapest personal computer for sale in the UK. This was succeeded by the more well-known ZX81 in the following year (sold as the Timex Sinclair 1000 in the United States). The ZX81 was one of the first computers in the UK to be aimed at the general public and was offered for sale via major high street retail channels. It would become a significant success, selling 1.5 million units.
In 1982 the ZX Spectrum was released, later becoming Britain's best-selling computer, competing aggressively against Commodore and other brands. Enhanced models in the form of the ZX Spectrum+ and 128 followed; again, to much fanfare. The ZX Spectrum series would sell more than 5 million units. The machine was also widely used as a home gaming platform, with more than 3,500 game titles eventually released for it.



Above: The Commodore 64, according to the Guinness Book of World Records, still the highest-selling desktop personal computer model of all time (left); The TI-99/4A, Texas Instruments' home computer, and the first commercially available home computer built around a 16-bit processor (right). Commodore and Texas Instruments engaged in a fierce rivalry and price war throughout the early to mid 1980s.




After moving its corporate headquarters to Bellevue, Washington in January 1979, Microsoft, led by founders and childhood friends Paul Allen and Bill Gates, entered the operating system (OS) business in 1980 with its own version of Unix called Xenix. The company really came into its own (and came to dominate the OS market), however, when IBM licensed MS-DOS from them in early 1981. Other innovations pioneered by the company at the time included the Microsoft Mouse in 1983, as well as a publishing division - Microsoft Press - founded the same year.

Unfortunately, just as the company was really beginning to take off, tragedy struck one of the two men at the heart of the operation. In 1983, Paul Allen was diagnosed with Hodgkin's lymphoma - a type of cancer that affects white blood cells. Allen would later claim, in a memoir he wrote about his time at Microsoft, that his old friend Gates attempted to dilute Allen’s share of the company following his diagnosis. According to Allen, Gates told him that this was because he [Allen] was “not working hard enough for the company”. Allen later invested in low-tech sectors, sports teams, commercial real estate, neuroscience, private space flight, and other ventures unrelated to Microsoft.

Meanwhile, Gates, who was already serving as CEO and Chairman of the Board of the burgeoning enterprise, began to take on an almost exclusively executive/management role, leaving the actual programming and design to others. Gates and Allen’s relationship had been strained for months prior to Allen’s diagnosis, primarily over disagreements about equity in the company and so forth. Gates later admitted that he “regretted” his and Allen’s falling out. The two later reconciled at the end of the decade and resumed their friendship, which lasted until the end of Allen’s life in 2018.




Meanwhile, in Kyoto, Japan, key events in the late 1970s and early 80s were shaping the destiny of another iconic video game company - Nintendo. Two of these events occurred in 1979: its American subsidiary - Nintendo America - was opened in New York City; and a new department focused on arcade game development was created. In 1980, one of the first handheld video game systems, the Game & Watch, was created by Gunpei Yokoi from the technology used in portable calculators. It became one of Nintendo's most successful products, with over 43.5 million units sold worldwide during its production run and 59 games released for it in total.

Around this time, Nintendo also entered the arcade market with titles like Sheriff, Radar Scope, and most iconically, Popeye. The last of these, helmed by lead designer Shigeru Miyamoto, would become one of the most popular arcade games of the decade.

Starring the iconic “sailor man” from the cartoons, the game made history as one of the first of a new genre - “platformers”. Players control Popeye as he climbs a series of ladders and avoids rolling barrels, with his ultimate objective being to reach Olive Oyl and rescue her from that brute Bluto. Players can also destroy the barrels and make Popeye nigh-invincible for a short time by grabbing a can of spinach, which is placed at a different location in each level. Over 15 million units of the Popeye game would be sold in America; when combined with the home console port - produced for the ColecoVision - that number rises exponentially. Indeed, after the flop of Radar Scope in the US, Popeye can largely be credited with saving the fledgling Nintendo America from ruin.

Above: Promotional poster for Popeye - the first arcade game that would make Nintendo a household name in America (left). The game’s lead-designer was rising star Shigeru Miyamoto. A prototype of the Nintendo Advanced Video System (AVS), built in late 1982 and released in 1983, distributed by Atari in North America (right).
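A purely illustrative sketch of the spinach power-up loop described above, with the duration and names invented here (this is not Nintendo's code):

```c
/* Toy version of the spinach power-up: grab the can, get a fixed window of
 * invincibility during which barrels are destroyed on contact. The duration
 * and structure names are invented for illustration. */
#include <stdbool.h>
#include <stdio.h>

#define SPINACH_FRAMES 600   /* assumed ~10 seconds at 60 frames per second */

typedef struct { int invincible_frames; } player_t;

static void grab_spinach(player_t *p) { p->invincible_frames = SPINACH_FRAMES; }

static void tick(player_t *p)
{
    if (p->invincible_frames > 0)
        p->invincible_frames--;    /* timer counts down once per frame */
}

/* Returns true if the player loses a life; while the spinach timer is
   running, the barrel is smashed instead. */
static bool barrel_collision(const player_t *p)
{
    return p->invincible_frames == 0;
}

int main(void)
{
    player_t popeye = { 0 };
    grab_spinach(&popeye);
    for (int i = 0; i < 10; ++i)
        tick(&popeye);
    printf("hit by a barrel: %s\n",
           barrel_collision(&popeye) ? "lose a life" : "barrel smashed");
    return 0;
}
```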

The only downside to Popeye was that it featured licensed characters, rather than originals that would belong to Nintendo exclusively. If the company wanted to continue producing games starring Popeye and his supporting cast, then they would need to keep renegotiating contracts with King Features over the rights. Thus, company management tasked Miyamoto with creating a cast of original characters who could star in Nintendo’s next games. This he would take up with gusto, creating a series of games that would serve as the foundation of Nintendo’s roster moving forward: Donkey Kong in 1982; Mario Brothers in 1983, starring Mario and Luigi Mario, a pair of Italian-American plumbers; and later, The Legend of Zelda in 1986.

All of these and more would eventually be released on Nintendo’s “Advanced Video System”, a redesign of the company’s earlier “Famicom” console, set to be released on the North American market, distributed by Atari, Inc. Though originally plans were made for an advanced 16-bit console that was really more of a home computer-hybrid, these were later scrapped. Nintendo management feared that the keyboard and other accessories would “overwhelm” non-technophiles and “frighten off” the emerging market of “casual” video game fans. Thus, the decision was made to stick to the more familiar, 8-bit, gamepad controller setup, with game cartridges, rather than the more advanced CD-ROMs, which had been experimented with on the Japanese market.

Though Atari and Nintendo would both be severely shaken by the so-called “Video Game Crash” that occurred the following year (caused largely by the glut of low-quality games and an overly saturated market), both companies would, thanks to high-quality control and a loyal customer base at the heart of their operations, emerge on the other side intact. Both would continue to dominate the video game industry until more Japanese companies - Konami & SEGA - and later, tech companies - Microsoft, Sony - entered the game.

Next Time on Blue Skies in Camelot: More US News & Politics from 1982
 
Was not expecting another update so soon after the last one, but this is a good surprise and I liked the chapter, Mr. President. Can't wait to read about how US news and politics are going.
 
Holy crap, it's here! I've been really excited for this update, and to catch it at the first minute!?
Steve Wozniak (AKA “Woz”) developed the X-1 and subsequent X-2 designs based on the earlier “Alto” design, which was in turn developed at Xerox PARC (Palo Alto Research Center) in the early 1970s. The X-2 had color graphics, a full QWERTY keyboard, and internal slots for expansion, which were mounted in a high quality streamlined plastic case. The monitor and I/O devices were sold separately. The high price of the X-2, along with limited retail distribution, caused it to initially lag in sales behind the other Trinity machines. However, in 1979 it surpassed the Commodore PET, receiving a sales boost attributed to the release of the extremely popular VisiCalc spreadsheet which was initially exclusive to the platform. Though it fell back to 4th place after the release of Atari’s popular 8-bit systems, the X-2 maintained “steady” sales growth. This was largely credited to its durability and longevity; it boasted a lifetime that was up to eight years longer than other machines. By 1985, the X-2 had sold more than 2.1 million units. By the time production ceased in 1993, that number had risen to 4 million. Clearly, Wozniak had designed one Hell of a computer.
Honestly, Steve Jobs and Steve Wozniak not forming their own company, but instead getting jobs at Xerox and just spearheading some efforts in one of its divisions, is probably better for them. Especially Jobs, who was famously quite arrogant. He wouldn't get ousted from his position as CEO by the other executives and found NeXT since... well, he isn't CEO of Xerox!
The 5200 version was better received and for good reason. Indeed, when the 5200 was officially released in November 1982, Pac-Man 5200 as it was known to critics, was one of the cartridges included with the console. A wise move. Sales of the 5200 increased dramatically.

Under Warner (their parent company) and Atari's chairman and CEO, Raymond Kassar, the company achieved its greatest success, selling millions of 2600s, 5200s, and personal computers. At its peak, Atari accounted for a third of Warner's annual income and was the fastest-growing company in US history at the time. It would, however, have to face an increasingly competitive market and a price-collapse. More on that later.
Nice to see Atari got their act together regarding the 5200, as well as their Pac-Man ports.
Starring the iconic “sailor man” from the cartoons, the game made history as one of the first of a new genre - “platformers”. Players control Popeye as he climbs a series of ladders and avoids rolling barrels, with his ultimate objective being to reach Olive Oyl and rescue her from that brute Bluto. Players can also destroy the barrels and make Popeye nigh-invincible for a short time by grabbing a can of spinach, which is placed at a different location in each level. Over 15 million units of the Popeye game would be sold in America; when combined with the home console port - produced for the ColecoVision - that number rises exponentially. Indeed, after the flop of Radar Scope in the US, Popeye can largely be credited with saving the fledgling Nintendo America from ruin.
The only downside to Popeye was that it featured licensed characters, rather than originals that would belong to Nintendo exclusively. If the company wanted to continue producing games starring Popeye and his supporting cast, then they would need to keep renegotiating contracts with King Features over the rights. Thus, company management tasked Miyamoto with creating a cast of original characters who could star in Nintendo’s next games. This he would take up with gusto, creating a series of games that would serve as the foundation of Nintendo’s roster moving forward: Donkey Kong in 1982; Mario Brothers in 1983, starring Mario and Luigi Mario, a pair of Italian-American plumbers; and later, The Legend of Zelda in 1986.
Hoo boy, I almost thought that you butterflied the existence of Mario entirely! Glad to see the portly plumber still exists, alongside his brother and Zelda. What's this timeline's Donkey Kong like, though? Is it like Donkey Kong Jr. but you play as Mario/Jumpman instead (kinda like the first Mario vs. Donkey Kong game on GBA, which got a remake on the Switch recently), or something else entirely? Do we still get the original, arcade version of Punch-Out!! in 1984?
All of these and more would eventually be released on Nintendo’s “Advanced Video System”, a redesign of the company’s earlier “Famicom” console, set to be released on the North American market, distributed by Atari, Inc. Though originally plans were made for an advanced 16-bit console that was really more of a home computer-hybrid, these were later scrapped. Nintendo management feared that the keyboard and other accessories would “overwhelm” non-technophiles and “frighten off” the emerging market of “casual” video game fans. Thus, the decision was made to stick to the more familiar, 8-bit, gamepad controller setup, with game cartridges, rather than the more advanced CD-ROMs, which had been experimented with on the Japanese market.
A-ha, I knew it! The Atari-Nintendo partnership is a go! Though, the Colecovision port of Popeye existing too kinda contradicts that. OTL, part of the reason why the Atari-Nintendo deal never went through was because Nintendo was making a port of Donkey Kong for the Colecovision first before the Atari 2600. I feel like it would make more sense for the Atari 5200 to get first dibs on a home console port of Popeye here so that the Atari-Nintendo deal can still happen. Maybe it's even the first step towards it! I also do wonder how the AVS will affect the sales of the 5200, especially since it releases only a year after the 5200 in 1983...

In addition, I'm a bit confused by the mention of CD-ROM's here; the first console add-on to support CD's in OTL was the PC-Engine/Turbo-Grafx 16 in 1988. I assume this was meant to be a reference to the Famicom Disk System, which uses a derivative of the QuickDisk format (and is a magnetic storage medium similar to a floppy disk)?
Though Atari and Nintendo would both be severely shaken by the so-called “Video Game Crash” that occurred the following year (caused largely by the glut of low-quality games and an overly saturated market), both companies would, thanks to high-quality control and a loyal customer base at the heart of their operations, emerge on the other side intact. Both would continue to dominate the video game industry until more Japanese companies - Konami & SEGA - and later, tech companies - Microsoft, Sony - entered the game.
Welp, it's no surprise that the crash still happened, but it does seem like the recovery is gonna be a lot quicker than in OTL. Bringing up Konami in the way you did is making me very curious as to what they're gonna get up to.

Overall, I loved this update as a whole. It's some really great stuff, Mr. President!
 
Holy crap, it's here! I've been really excited for this update, and to catch it at the first minute!?

Honestly, Steve Jobs and Steve Wozniak not forming their own company, but instead getting jobs at Xerox and spearheading some efforts in one of their divisions, is probably better for them. Especially Jobs, who was famously quite arrogant. He wouldn't get ousted from his position as CEO by the other executives and found NeXT since... well, he isn't CEO of Xerox!

Nice to see Atari got their act together regarding the 5200, as well as their Pac-Man ports.


Hoo boy, I almost thought that you butterflied the existence of Mario entirely! Glad to see the portly plumber still exists, alongside his brother and Zelda. What's this timeline's Donkey Kong like, though? Is it like Donkey Kong Jr. but you play as Mario/Jumpman instead (kinda like the first Mario vs. Donkey Kong game on GBA, which got a remake on the Switch recently), or something else entirely? Do we still get the original, arcade version of Punch-Out!! in 1984?

A-ha, I knew it! The Atari-Nintendo partnership is a go! Though, the ColecoVision port of Popeye existing too kinda contradicts that. OTL, part of the reason the Atari-Nintendo deal never went through was that Nintendo was making a port of Donkey Kong for the ColecoVision before the Atari 2600. I feel like it would make more sense for the Atari 5200 to get first dibs on a home console port of Popeye here so that the Atari-Nintendo deal can still happen. Maybe it's even the first step towards it! I also do wonder how the AVS will affect the sales of the 5200, especially since it releases only a year after the 5200 in 1983...

In addition, I'm a bit confused by the mention of CD-ROMs here; the first console add-on to support CDs in OTL was the PC Engine/TurboGrafx-16 in 1988. I assume this was meant to be a reference to the Famicom Disk System, which uses a derivative of the QuickDisk format (and is a magnetic storage medium similar to a floppy disk)?

Welp, it's no surprise that the crash still happened, but it does seem like the recovery is gonna be a lot quicker than in OTL. Bringing up Konami in the way you did is making me very curious as to what they're gonna get up to.

Overall, I loved this update as a whole. It's some really great stuff, Mr. President!
Thank you! :) Glad you enjoyed it. My apologies for the mistake. I will fix the error vis-à-vis the CD-ROM.
 
Chapter 158 - Shake it Up: The PC & Video Game Revolutions
Above: The IBM Personal Computer (left); the X-128 Computer along with Xerox executive Steve Jobs (right); in 1982 both would become early icons of the personal computing revolution.

“Shake it up, make a scene
Let them know what you really mean
And dance all night, keep the beat
And don't you worry 'bout two left feet
Just shake it up, oo, oo
Shake it up, oo oo, yeah
Shake it up, oo, oo
Shake it up, oh, yeah”
- “Shake it Up” by the Cars

“I think it's fair to say that personal computers have become the most empowering tool we've ever created. They're tools of communication, they're tools of creativity, and they can be shaped by their user.” - Bill Gates

“Stay hungry, stay foolish.” - Steve Jobs

The personal computing revolution, which had its origins in the microchip developments of the mid to late 1970s, exploded into life in the early 1980s.

The 1977 release of what came to be called the “Trinity” - the Commodore PET 2001, the Xerox X-2, and the Tandy/Radio Shack TRS-80 Model 1 - marked a turning point. Indeed, in the run up to the release of the “Trinity”, several firms were in fierce competition to develop and release the first truly successful commercial PC.

Chuck Peddle - an American electrical engineer who had left Motorola for MOS Technology - designed the Commodore PET (Personal Electronic Transactor) around the MOS 6502 processor, which he had also created. The PET was, in essence, a single-board computer with a simple TTL-based CRT driver circuit driving a small, built-in monochrome monitor with 40×25 character graphics. The processor card, keyboard, monitor, and cassette drive were all mounted in a single metal case. In 1982, Byte Magazine referred to the PET design as “the world's first personal computer”.
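To make the “character graphics” idea concrete, here is a rough, hypothetical sketch in C of how a 40×25 character-cell display works in principle: the screen is just a small block of memory (1,000 bytes on a machine like this), and writing a byte into it selects which glyph the video circuitry draws in that cell. None of this is actual PET code; the buffer, helper function, and message strings are invented for illustration.

```c
/* Hypothetical illustration of a 40x25 character-cell display. */
#include <stdio.h>
#include <string.h>

#define COLS 40
#define ROWS 25

static char screen[ROWS][COLS]; /* stand-in for memory-mapped screen RAM */

static void put_text(int row, int col, const char *s)
{
    /* Writing bytes into screen memory is all it takes to display text. */
    memcpy(&screen[row][col], s, strlen(s));
}

int main(void)
{
    memset(screen, ' ', sizeof screen);
    put_text(0, 0, "*** COMMODORE BASIC ***");
    put_text(2, 0, "READY.");

    /* Dump the buffer the way the CRT driver would scan it out. */
    for (int r = 0; r < ROWS; r++)
        printf("%.*s\n", COLS, screen[r]);
    return 0;
}
```

The appeal of this scheme for a cheap 1977 machine is that 1,000 bytes of video memory and a character ROM are all the hardware needs to show a full screen of text.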

The PET shipped in two models: the 2001-4 with 4 KB of RAM, and the 2001-8 with 8 KB. The machine also included a built-in Datassette for data storage located on the front of the case, which left little room for the keyboard. The 2001 was announced in June 1977 and the first 100 units were shipped in mid-October of that year.

Although the machine was fairly successful, there were frequent complaints about the tiny calculator-like keyboard, often referred to as a "chiclet keyboard" due to the keys' resemblance to the popular gum candy. This was addressed in the upgraded "dash N" and "dash B" versions of the 2001, which put the cassette outside the case, and included a much larger keyboard with a full stroke non-click motion. Internally a newer and simpler motherboard was used, along with an upgrade in memory to 8, 16, or 32 KB, known as the 2001-N-8, 2001-N-16 or 2001-N-32, respectively.

The PET was the least successful of the 1977 Trinity machines, with under 1 million sales.


Above: The “Trinity” of early PC units: the Commodore PET 2001 (left), the Xerox X-2 (center), and the Tandy/Radio Shack TRS-80 Model 1 (right).​

Steve Wozniak (AKA “Woz”) developed the X-1 and its successor, the X-2, based on the earlier “Alto”, which had been developed at Xerox PARC (Palo Alto Research Center) in the early 1970s. The X-2 had color graphics, a full QWERTY keyboard, and internal expansion slots, all mounted in a high-quality, streamlined plastic case. The monitor and I/O devices were sold separately. The high price of the X-2, along with limited retail distribution, caused it to initially lag behind the other Trinity machines in sales. However, in 1979 it surpassed the Commodore PET, receiving a boost attributed to the release of the extremely popular VisiCalc spreadsheet, which was initially exclusive to the platform. Though it fell back to fourth place after the release of Atari’s popular 8-bit systems, the X-2 maintained “steady” sales growth. This was largely credited to its durability and longevity; it boasted a production lifetime up to eight years longer than other machines. By 1985, the X-2 had sold more than 2.1 million units. By the time production ceased in 1993, that number had risen to 4 million. Clearly, Wozniak had designed one Hell of a computer.


Above: Steve Wozniak, lead designer of the “X-2” computer.​

Finally, the Tandy Corporation (better known as Radio Shack) introduced the TRS-80, retroactively known as the Model I as the company expanded the line with more powerful models. The Model I combined the motherboard and keyboard into one unit, with a separate black-and-white monitor and power supply. Tandy's more than 3,000 Radio Shack storefronts ensured the computer would have a breadth of distribution and support (repair, upgrade, and training services) that neither Commodore nor Xerox could match.

Despite this huge capacity for sales, however, the Model I suffered from myriad technical difficulties. For one thing, it could not meet FCC regulations on radio interference due to its plastic case and exterior cable design. There were also internal problems. Keystrokes would randomly repeat at times, and the earliest versions of the hardware produced bizarre glitches. Though these were promptly patched, by that time the damage was done. Amongst enthusiasts, the Model I developed a reputation as a “glitchy”, unreliable machine. Radio Shack managed to sell about 1.5 million of them before discontinuing production in favor of their Model II and, later, Model III.

A few years later, in January of 1980, as the nation reeled from news that Mo Udall would not seek a second term and prepared for truly contested primaries from both parties, Byte magazine announced in an editorial that “the era of off-the-shelf personal computers has arrived”.

Whereas before, PCs were seldom, if ever, found in individuals’ homes, they were fast becoming a consumer product, an appliance that more and more everyday Americans would have access to. Though the author of that article admitted that his own PC had cost him $6,000 cash from his local store, he claimed that those costs were “bound to drop as the technology becomes more widely available”. He couldn’t have known how true that prediction would prove to be.

At the time of that article’s publication, the aforementioned pioneers - Radio Shack, Commodore, and Xerox - manufactured the vast majority of the roughly half-million microcomputers then in existence. As component prices continued to fall, however, many companies entered the computer business. This led to an explosion of low-cost machines known as “home computers”, which sold millions of units before the market imploded in a price war in the early 1980s. Below are just a few of the many companies that got in on the “home computer” market.




In the late 1970s, Atari was already a household name in the United States. This was due both to their hit arcade games (Pong, Breakout, Asteroids, etc.) and to the hugely successful Atari VCS game console (and its iconic cartridges). Realizing that the VCS (AKA the “Atari 2600”) would have a limited lifetime in the market before a technically superior competitor came along, Atari decided they would become that competitor, and started work on a new, much more advanced console design.

Whilst these designs were under development, the “Trinity” machines hit the home PC market amidst considerable fanfare. Atari thus faced an important decision: should they continue to focus their attention on video game consoles, or should they shift their efforts toward a home computer system instead? In the end, the company decided that the possible reward was worth the risk and tried its hand at designing a PC. Atari did have some advantages over potential competitors in this market.

For one thing, thanks to the runaway success of the 2600, the company had a sound understanding of the home electronics market. Consumers generally wanted high-quality products that would last a long time and were simple and easy to use. The average American had little understanding of how electronics worked, but if the interface was made intuitive enough, then that wouldn’t matter. As a result of these insights, Atari’s first commercial PCs - the Atari 400 and 800, released in 1978 and mass-marketed the following year - were virtually indestructible and just as easy to use as their consoles. The design concept was the same as the 2600’s - just plug in a cartridge and go. With a trio of custom graphics and sound co-processors and a 6502 CPU clocked at 1.79 MHz - roughly 80% faster than most competitors’ machines - the Atari computers had capabilities that no other microcomputer could match.

Despite these advantages, however, Atari’s initially strong sales (~600,000 units by 1981) slowed once faced with competition from the Commodore 64, which saw release in 1982. Eventually, Atari would withdraw the proverbial toe it had dipped into the home computer market and redouble its efforts on the video game front. By that time, they were facing increased competition from other firms with other consoles.

1982 would thus prove a pivotal year for Atari.

As soon as the 2600 shipped, work began on its successor. This next-generation console would, in time, come to be labeled the “5200”. Management’s hope was that the company’s recent (if modest) success in the PC market would provide a solid platform for launching their next console. The team responsible for designing the 5200 faced a number of challenges, however.


Above: An early prototype of the Atari 5200, its “next-generation” console, after the 2600.​

First, the team had to determine, at a conceptual level, what the 5200 would be. The primary appeal of the console was supposed to be that it would be technically superior to the 2600, boasting better graphics and better performance. However, with how saturated the 2600 market already was, an obvious concern was whether the new 5200 would be backwards compatible. The question provoked fierce debate among the designers.

Proponents argued that consumers would feel “betrayed” and “angry” if their expansive library of 2600 cartridges were suddenly rendered “obsolete and useless” by the next-generation console. Backwards compatibility would also mean that until the 5200 gradually replaced the 2600, Atari could continue to design and release games for the older console.

Those who opposed backward compatibility argued that these supposed strengths were actually weaknesses. Making the 5200 able to play 2600 games would require that hardware components be added to make compatibility with the outdated software possible. This would increase costs that would, in turn, have to be passed along to consumers. Further, management feared that backwards compatibility would discourage coders and third-party game developers from utilizing the new technical capabilities of the next generation console.

In the end, management decided that higher costs (but a longer shelf life and happier customers) were a trade-off worth making. The 5200 would be backwards compatible.
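The passage above treats backwards compatibility as a hardware question, so it is worth sketching what the decision means for the machine at power-on. The C fragment below is a purely hypothetical illustration - the header fields, magic number, and mode names are all invented and do not reflect Atari's actual design - but it shows the general idea: the console inspects the inserted cartridge and routes it either to the legacy (2600-style) hardware path or to the native 5200 path.

```c
/* Hypothetical cartridge-detection sketch for a backwards-compatible console. */
#include <stdint.h>
#include <stdio.h>

typedef enum { MODE_LEGACY_2600, MODE_NATIVE_5200 } console_mode;

typedef struct {
    uint32_t signature; /* invented: native carts carry a magic number */
    uint32_t rom_size;  /* invented: legacy carts are 4 KB or smaller  */
} cart_header;

#define NATIVE_MAGIC 0x35323030u /* the ASCII bytes "5200", purely illustrative */

static console_mode select_mode(const cart_header *h)
{
    /* Native cartridges identify themselves; anything else falls back
     * to the compatibility hardware. */
    return (h->signature == NATIVE_MAGIC) ? MODE_NATIVE_5200 : MODE_LEGACY_2600;
}

int main(void)
{
    cart_header old_cart = { 0, 4096 };             /* e.g. a 2600 game */
    cart_header new_cart = { NATIVE_MAGIC, 32768 }; /* e.g. a 5200 game */

    printf("old cart -> %s\n",
           select_mode(&old_cart) == MODE_LEGACY_2600 ? "legacy mode" : "native mode");
    printf("new cart -> %s\n",
           select_mode(&new_cart) == MODE_NATIVE_5200 ? "native mode" : "legacy mode");
    return 0;
}
```

The extra cost the opponents worried about comes from the "legacy mode" branch: supporting it means shipping (or emulating) the older chips inside every new console.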

Another major challenge came from the limitations of the hardware available in home consoles at the time. For all their bulk, arcade cabinets had the physical space to house enough memory and support circuitry for more complex (and engaging) games. Porting these games to home consoles often meant cutting features out of necessity.

In 1982, Atari received what should have been a golden opportunity: the rights to develop and publish the console port of Pac-Man, arguably the most popular arcade game of all time. Unfortunately, they were forced to produce two versions (issued on two differently colored cartridges) - one for the 2600 and one for the 5200. The former was far inferior to the latter. To compensate for the lack of ROM space, many visuals were removed, much to the chagrin of fans. The console's video hardware could also only draw a handful of sprites per frame, so when multiple ghosts appeared on screen they had to be drawn on alternating frames, producing a noticeable flicker. This version of the game was panned and did not sell well.
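For readers wondering why limited sprite hardware produces flicker rather than simply missing graphics, here is a minimal, hypothetical sketch of the usual workaround, often called sprite multiplexing: if the chip can only display one ghost at a time, the game rotates through the ghosts on successive frames, so each one is drawn only a fraction of the time and appears to blink. The frame counts and names below are illustrative and not taken from the actual port.

```c
/* Minimal sketch of round-robin sprite multiplexing causing flicker. */
#include <stdio.h>

#define NUM_GHOSTS 4

int main(void)
{
    /* Simulate 8 frames: only one ghost is drawn per frame, so each ghost
     * is visible just 25% of the time - the eye perceives this as flicker. */
    for (int frame = 0; frame < 8; frame++) {
        int visible_ghost = frame % NUM_GHOSTS;
        printf("frame %d: drawing ghost %d (the other three are blanked)\n",
               frame, visible_ghost);
    }
    return 0;
}
```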

The 5200 version was better received, and for good reason. Indeed, when the 5200 was officially released in November 1982, “Pac-Man 5200”, as it was known to critics, was one of the cartridges included with the console. A wise move: sales of the 5200 increased dramatically.

Under its parent company, Warner, and chairman and CEO Raymond Kassar, Atari achieved its greatest success, selling millions of 2600s, 5200s, and personal computers. At its peak, Atari accounted for a third of Warner's annual income and was the fastest-growing company in US history at the time. It would, however, soon have to face an increasingly competitive market and a price collapse. More on that later.




Above: The Sinclair ZX Spectrum - Britain’s best-selling computer of the 1980s.​

Sinclair Research Ltd was a British electronics company founded by Sir Clive Sinclair in Cambridge. Sinclair had originally founded Sinclair Radionics, but by 1976 was beginning to lose control over the company and started a new independent venture to pursue projects under his own direction.

Following the commercial success of the MK14 - a kit computer aimed at electronics enthusiasts released in 1977 - Sinclair Research (then trading as Science of Cambridge) entered the home computer market in 1980 with the ZX80, priced at £99.95. At the time, the ZX80 was the cheapest personal computer for sale in the UK. It was succeeded the following year by the better-known ZX81 (sold as the Timex Sinclair 1000 in the United States). The ZX81 was one of the first computers in the UK aimed at the general public and was offered for sale via major high street retail channels. It would become a significant success, selling 1.5 million units.

In 1982 the ZX Spectrum was released, later becoming Britain's best-selling computer and competing aggressively against Commodore and other brands. Enhanced models in the form of the ZX Spectrum+ and 128 followed; again, to much fanfare. The ZX Spectrum series would sell more than 5 million units. The machine was also widely used as a home gaming platform, with more than 3,500 game titles eventually released for it.



Above: The Commodore 64, according to the Guinness Book of World Records, still the highest-selling desktop personal computer model of all time (left); The TI-99/4A, Texas Instruments’ home computer, and the first 16-bit computer to be commercially available (right). Commodore and Texas Instruments engaged in a fierce rivalry and price war throughout the early to mid 1980s.




After moving its corporate headquarters to Bellevue, Washington in January 1979, Microsoft, led by founders and childhood friends Paul Allen and Bill Gates, entered the operating system (OS) business in 1980 with Xenix, its own version of Unix. The company really came into its own (and came to dominate the OS market), however, when IBM licensed MS-DOS from it in early 1981. Other innovations pioneered by the company at the time included the Microsoft Mouse in 1983, as well as a publishing division - Microsoft Press - founded the same year.

Unfortunately, just as the company was really beginning to take off, tragedy struck one of the two men at the heart of the operation. In 1983, Paul Allen was diagnosed with Hodgkin's lymphoma - a type of cancer that affects white blood cells. Allen would later claim, in a memoir he wrote about his time at Microsoft, that his old friend Gates attempted to dilute Allen’s share of the company following his diagnosis. According to Allen, Gates told him that this was because he [Allen] was “not working hard enough for the company”. Allen later invested in low-tech sectors, sports teams, commercial real estate, neuroscience, private space flight, and other ventures unrelated to Microsoft.

Meanwhile, Gates, who was already serving as CEO and Chairman of the Board of the burgeoning enterprise, began to take on an almost exclusively executive/management role, leaving the actual programming and design to others. Gates and Allen’s relationship had been strained for months prior to Allen’s diagnosis, primarily over disagreements about equity in the company and so forth. Gates later admitted that he “regretted” his and Allen’s falling out. The two later reconciled at the end of the decade and resumed their friendship, which lasted until the end of Allen’s life in 2018.




Meanwhile, in Kyoto, Japan, key events in the late 1970s and early 80s were shaping the destiny of another iconic video game company - Nintendo. Two of these events occurred in 1979: its American subsidiary - Nintendo America - was opened in New York City; and a new department focused on arcade game development was created. In 1980, one of the first handheld video game systems, the Game & Watch, was created by Gunpei Yokoi from the technology used in portable calculators. It became one of Nintendo's most successful products, with over 43.5 million units sold worldwide during its production run and a total of 59 games released for the line.

Around this time, Nintendo also entered the arcade market with titles like Sheriff, Radar Scope, and most iconically, Popeye. The last of these, helmed by lead-designer Shigeru Miyamoto, would become one of the most popular arcade games of the decade.

Starring the iconic “sailor man” from the cartoons, the game made history as one of the first of a new genre - “platformers”. Players control Popeye as he climbs a series of ladders and avoids rolling barrels, with his ultimate objective being to reach Olive Oyl and rescue her from that brute Bluto. Players can also destroy the barrels and make Popeye nigh-invincible for a short time by grabbing a can of spinach, which is placed at a different location in each level. Over 15 million units of the Popeye game would be sold in America; when combined with the home console port - produced for the ColecoVision - that number rises considerably. Indeed, after the flop of Radar Scope in the US, Popeye can largely be credited with saving the fledgling Nintendo America from ruin.
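For the programmers in the thread, the spinach mechanic described above is a classic timed power-up, and a hypothetical sketch of the logic might look like the C fragment below. The frame counts and function names are invented for illustration; they are not based on Nintendo's actual code.

```c
/* Hypothetical sketch of a timed "spinach" power-up. */
#include <stdbool.h>
#include <stdio.h>

#define SPINACH_FRAMES 300  /* invented: ~5 seconds at 60 frames per second */

static int invincible_timer = 0;

static void grab_spinach(void) { invincible_timer = SPINACH_FRAMES; }

static bool hit_by_barrel(void)
{
    if (invincible_timer > 0)
        return false;       /* barrel is destroyed instead of Popeye */
    return true;            /* normal hit: lose a life */
}

static void tick(void)      /* called once per frame */
{
    if (invincible_timer > 0)
        invincible_timer--;
}

int main(void)
{
    grab_spinach();
    for (int frame = 0; frame < SPINACH_FRAMES + 1; frame++)
        tick();
    printf("after the power-up expires, a barrel hit is fatal: %s\n",
           hit_by_barrel() ? "yes" : "no");
    return 0;
}
```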

Above: Promotional poster for Popeye - the first arcade game that would make Nintendo a household name in America (left). The game’s lead-designer was rising star Shigeru Miyamoto. A prototype of the Nintendo Advanced Video System (AVS), built in late 1982 and released in 1983, distributed by Atari in North America (right).

The only downside to Popeye was that it featured licensed characters, rather than originals that would belong to Nintendo exclusively. If the company wanted to continue producing games starring Popeye and his supporting cast, then they would need to keep renegotiating contracts with King Features over the rights. Thus, company management tasked Miyamoto with creating a cast of original characters who could star in Nintendo’s next games. This he would take up with gusto, creating a series of games that would serve as the foundation of Nintendo’s roster moving forward: Donkey Kong in 1982; Mario Brothers in 1983, starring Mario and Luigi Mario, a pair of Italian-American plumbers; and later, The Legend of Zelda in 1986.

All of these and more would eventually be released on Nintendo’s “Advanced Video System”, a redesign of the company’s earlier “Famicom” console, set to be released on the North American market, distributed by Atari, Inc. Though originally plans were made for an advanced 16-bit console that was really more of a home computer-hybrid, these were later scrapped. Nintendo management feared that the keyboard and other accessories would “overwhelm” non-technophiles and “frighten off” the emerging market of “casual” video game fans.

Though Atari and Nintendo would both be severely shaken by the so-called “Video Game Crash” that occurred the following year (caused largely by a glut of low-quality games and an oversaturated market), both companies would, thanks to strict quality control and a loyal customer base at the heart of their operations, emerge on the other side intact. Both would continue to dominate the video game industry until more Japanese companies - Konami & SEGA - and later, tech companies - Microsoft, Sony - entered the game.

Next Time on Blue Skies in Camelot: More US News & Politics from 1982
Excellent update, Mr. President. I must admit, I thought that the latest update would have been posted next week. It's a surprise, but a welcome surprise. Keep up the good work, sir.
 
Mr. President, returning to Denis Healey: given that he's taking OTL Margaret Thatcher's role in international fame and popular culture ITTL, I have come up with several possible nicknames for Healey that could become famous throughout the world, much as Thatcher became known internationally as the "Iron Lady":


- The Lion of Leeds
- The Iron Eyebrows
- The Velvet Fist

Of course, if you, sir, or other commenters could create better nicknames, feel free to add them :) .
 
Yes, I reckon the British newspapers will delight in all sorts of nicknames, including maybe: "The Right Honourable Hedge".

Although, a thought did just occur to me. Without Margaret Thatcher as PM (IOTL she was by far the show's most prominent character), could 1984's Spitting Image survive for long? Or would the show still continue ITTL but not be as memorable, given that the OTL show had an anti-Thatcher stance, since many of its staff were prominently left-wing and kept remaking her puppet to look ever more evil?
If the show does continue, it'd be quite a treat to see what they would do with the Kennedys!
 
I think Mario was born out of Donkey Kong but otherwise pretty good
 