I worked at PARC (in the Learning Research Group) for most of a year in 1974, and later at Apple for 10 years. In both places I worked with Dan Ingalls and other Smalltalk folks. At Apple I was very involved in projects that led to various HyperCard related follow ons including AppleScript.
This article does not ring true to me at all. The problem with making a big fraction of users into programmers is that we don't know how to do it. The LRG at PARC and then various groups at Apple tried every way they (we) could think of, and other ways have continued to be invented and tried. So far none work. HyperCard was indeed the most accessible development environment, but only a small fraction of HyperCard users ever wrote code (maybe 5%).
Apple has continued to make user programming of Macs as easy as they conveniently can (as far as I can tell having been out of there for a long time). iOS has a different goal -- making devices safe and usable -- which is intrinsically in conflict with maximum programmability. Even on iOS there are plenty of "user programmable" apps.
I think part of the problem here is that many developers take themselves and their developer friends as "typical" but that is totally not the case. This really treats most of the population with (unintentional) disrespect.
5% seems pretty good to me actually! The number for iOS is probably, what, like .01%?
I have a soft spot for HyperCard -- it was the first programming I did, back in middle school, on the school computers.
They were those old black-and-white Macs with HyperCard, and a variety of stacks. We all knew the trick to turn on author mode (level 5 or something? it's been a while!), and from there you could explore and mess with existing stacks.
I still remember the thrill when I first discovered loops, and that I could write a script that would move a character on the screen in response to keys being held down. Then I discovered if statements, & made enemies that chased the main character. Honestly, I FELT LIKE GOD. I had created life.
No instruction, no books, it was all just poking around.
There's really something magical about an environment like that, and it's sad that we've gotten away from it.
(Maybe Minecraft or something is the modern equivalent?)
Love your reminiscence. Really captures that awesome moment when the basic idea fully takes hold.
I do wonder why HyperCard's modern clones haven't been more successful. Maybe there are just too many relatively easy interactive environments for any to stand out. For example any browser (with the debugger open) is a pretty amazing interactive graphical development environment -- now with full 3D rendering etc. if you get ambitious.
Excel is, by far, the most successful programming-for-the-masses environment ever created. I'm not talking about VBA, either -- Just the sheet and formulas are a brilliantly accessible functional programming environment.
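The "functional programming environment" claim is easy to make concrete: every cell is a pure expression over other cells, and recalculation is just re-evaluation. A toy sketch of that model (the cell names and formulas here are made-up examples, with Python standing in for the sheet):

```python
# Toy model of a spreadsheet: each cell is a pure function of other
# cells, looked up by name. There is no mutable state beyond the cell
# definitions, so "recalculation" is simply evaluating on demand.
cells = {
    "A1": lambda get: 10,
    "A2": lambda get: 32,
    "A3": lambda get: get("A1") + get("A2"),                # like =A1+A2
    "B1": lambda get: "high" if get("A3") > 40 else "low",  # like =IF(A3>40,"high","low")
}

def get(name):
    """Evaluate a cell on demand, the way a sheet recalculates."""
    return cells[name](get)

print(get("A3"), get("B1"))  # 42 high
```

Change any input cell and every dependent formula picks it up on the next evaluation -- which is exactly the property that makes the sheet accessible: users write pure expressions and never think about control flow or state.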
"Apple has continued to make user programming of Macs as easy as they conveniently can"
I don't see how that is true. Xcode is not installed by default. You can use AppleScript and Automator to automate some tasks, but you can't really call that programming.
The source code to the whole system is closed, which limits discoverability.
I remember in my childhood playing Gorillas on my DOS machine. I pressed a button and suddenly all the code for it popped up. I learned that if you change a part of that, the game would change.
There is NOTHING like that on a Mac install today. The closest thing is in the web browser.
> You can use AppleScript and Automator to automate some tasks, but you can't really call that programming
If HyperCard was programming, so is AppleScript. So is Python. So is bash. I think we're talking about "programming" in the general sense - since I also see Excel included alongside HyperCard in this reminiscence - not the more specific (and arbitrary, IMHO) programming vs. scripting language divide.
I loved HyperCard and was also a regular in the AppleScript world for a decade (and made a good part of my living on it at the time), but there are far more options for anyone remotely motivated to program today, and with the exception of Xcode (probably the least accessible option) they are preinstalled and freely available. And if downloading a free development environment with a simple installer is too great a burden today, I don't think you would have gotten anywhere in the good old days.
I don't really see the burden today - we have more options than ever, and many are available out of the box, and the rest almost universally free. I can automate my OSX environment with AppleScript, JavaScript, Python, Ruby, and bash (and doubtless others) out of the box. I can build command line applications out of the box. I can build Web applications with GUIs out of the box. I can easily extend AppleScript or Python (and, I'm certain, Ruby and bash) to allow me to build GUI applications.
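For what it's worth, the "out of the box" point is easy to demonstrate. Here's a sketch of the kind of chore a motivated non-developer might automate -- batch-renaming files -- using only the Python standard library that ships preinstalled; the directory and file names are hypothetical examples:

```python
# Sketch: rename every .txt file in a directory to a numbered scheme,
# using only the standard library. The files created here are just
# placeholder examples so the snippet is self-contained.
import tempfile
from pathlib import Path

workdir = Path(tempfile.mkdtemp())
for name in ["notes.txt", "draft.txt", "todo.txt"]:
    (workdir / name).write_text("placeholder")

# The actual automation: a deterministic, sorted rename pass.
for i, path in enumerate(sorted(workdir.glob("*.txt")), start=1):
    path.rename(workdir / f"document_{i:03}.txt")

print(sorted(p.name for p in workdir.iterdir()))
# ['document_001.txt', 'document_002.txt', 'document_003.txt']
```

Ten lines, no installer, no toolchain -- the barrier today really is motivation, not availability.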
Programming in the 80s was different, for sure, but I have a hard time understanding the sentiment that it was somehow more accessible. Sure, BASIC was preinstalled on some platforms. Still, most platforms today have more, and better, options out of the box than any platform of the early 90s. Did you write C in the 90s for Mac OS? If we're accusing Apple of being inaccessible today because Xcode isn't preinstalled, the 90s were far worse: you had to pay for THINK C or CodeWarrior.
This is sort of tangential to the other issues being discussed (which I agree are important), but compared to the era of software distributed on physical media, today "Xcode is not installed by default" does not mean that much given that installing it only takes a few clicks in the App Store. Yeah, it's a few gigs to download, but Apple has always been forward-looking, and on a fast internet connection that's only five minutes' wait.
Not to mention that, also unlike then, it's easy to access tutorials and videos galore on the Internet to show you how to get started with development.
It would be interesting to mandate that all Mac apps be written in JavaScript so users could modify them. However, this seems ambitious. Even mandating that all apps be scriptable seems to be a bridge too far.
Note that JavaScript is supported in addition to AppleScript as a user-level scripting language.
Much of the Mac OS is in fact open source -- see Darwin and other projects. However, having the code for the OS available -- or having Xcode installed -- does nothing for user programmability.
This comment is a good example of why it doesn't work for us developers to see ourselves as typical users.
Apple still supports Widgets, which are canned JavaScript built to run in a special Dashboard app. (Does anyone still develop Widgets?)
But this still misses the point about HyperCard, which was that HyperCard was its own toolchain. To get started with it, you didn't have to:
Install a separate editor
Learn a separate editor
Install a build system
Learn how to build a project
Learn how to share code
Deal with dependencies
It's the no-separate-toolchain feature that made HyperCard so accessible, and which was influenced by Smalltalk.
Modern programming is a nightmare in comparison - including modern app programming.
Just because Apple uses Objective-C doesn't mean it's approaching app development in a Smalltalk-like way. Getting an app out of Xcode and into the store is a horror story of provisioning profiles, debug vs production builds, sandboxes, entitlements, and so on. Obviously people manage to fight through this, but it's a long way from being friendly and accessible.
IMO only VBA gets close to being a successful no-separate-toolchain environment - which is one of the main reasons VBA became so popular.
(You could argue Python is, but I don't think it's equivalent, because unlike VBA and HyperCard, the first thing the user sees is the editor, and not a product that's obviously and immediately useful but also happens to include a code editor.)
Developers are definitely not the typical users. The hoops you have to jump through now are much greater than on the Apple II, where you got a BASIC prompt.
I think the reason people are scared of such things is not because they are incapable of learning the tools, but because developer culture has a good reason to make it look hard. Otherwise, why are we paid so well?
> making a big fraction of users into programmers is that we don't know how to do it
It depends on how you define "programmer." At Microsoft in the late 90s/early 00s we were very happy that there were so many Visual Basic programmers, but several million of them were people who would not self-identify as programmers. That is, they were people who would write VB one-liners in either Excel or Access; people who would make or modify "keyboard record & replay" macros; etc.
I agree that trying to make everybody a full-blown systems or application programmer seems like a very hard problem.
By "programming" here I just meant someone who writes (or modifies) even a small amount of code. We found that this was always an extremely small fraction of users, even in the easiest cases.
I certainly agree that self-identifying as programmers would be totally the wrong criterion.
Consider the number of people today who edit a URL in their browser location bar (for example to delete unnecessary trailing information if they are going to email it, or to try to fix it if it doesn't work). I bet a good study of a representative user population would find that only 10% or less would do this even if it was suggested.
If someone is willing to do that I'm willing to consider them a programmer in this context. They have made the basic leap to understand how to map between text and behavior, how to debug, etc.
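That "map between text and behavior" leap can be made concrete. The manual edit described above -- deleting unnecessary trailing information before emailing a link -- is a tiny transformation; a sketch using the standard library (the example URL is made up):

```python
# Sketch: strip the query string and fragment from a URL before
# sharing it -- the manual address-bar edit, done programmatically.
# The example URL below is hypothetical.
from urllib.parse import urlsplit, urlunsplit

def clean(url):
    """Keep scheme, host, and path; drop query and fragment."""
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

print(clean("https://example.com/article/123?utm_source=mail#comments"))
# https://example.com/article/123
```

Someone who performs this edit by hand has already internalized the structure the code makes explicit -- which is exactly the point about the leap.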
Not sure about it constituting programming, but that (awareness of being able to manipulate the resource portion of a URL) does strike me as a good litmus test of general power-userdom.
Funnily enough, starting from Yosemite, Safari has started to hide that portion until you click into the address bar.
I don't think it's even a matter of definition. Most people just don't want to program computers. They want to just sit down and use them. If they really did want to write programs (in any way you define that term), they would: the tools are available. A few people explore macros, etc. provided in software such as Excel, but most people don't, and it's not because of the tools; it's because they just don't have the need to do it.
Those of us who became programmers in part because of discoverable built-in programming environments would argue that even if only .5% of users do any programming, the inclusion of those features is worthwhile because the other 99.5% of users will benefit from the code written by the few who were inspired to become programmers.
Design for extremes: if you make stuff for people who hate programming, that will make it easier for people who love it. You may end up coming up with ways to boost productivity (and fun) for people you never even thought about before:
https://books.google.com/books?id=idNhCcrANP0C&lpg=PA57&ots=...
Like the philosophy of the book, though the examples are really old school. I don't think the operative dimension is "love" vs. "hate" -- more like "can easily handle" vs. "find difficult".
But I don't see how to apply it to programmability -- and I don't think Apple has found any way. How could you make something programmable for people whose eyes slide over a URL without being able to see its parts? (Note: they maybe are willing but they can't, just like grandpa can't make his hands stop shaking and get the key into the lock. Naturally after enough frustration they come to "hate" whatever it is.)
Those roles weren't so well defined as they are in the marketplace now. But basically, less resistance than the user base. People working around computers are already somewhat self-selected to be more comfortable with software. Plus of course they knew being able to understand code would increase their marketability.