Computer Programming Is a Dying Art (newsweek.com)
12 points by trickyhero on May 30, 2014 | hide | past | favorite | 31 comments


Even back in the 1980s everyone knew that "fifth-generation" programming languages would make programming obsolete. This article seems to take a bunch of press releases and use them to claim that programming will be obsolete in a few years.

The problem with these articles is that programming is AI-complete. Therefore, once we've solved programming, we will also have a general AI, which is 100x more interesting than talking about programming computers. Yes, it'll happen, maybe even in a few decades, but that discussion is very fuzzy.

The article also seems to imply that programming using higher-level tools (e.g. visual programming) is not "true" programming, but I disagree with that definition. I also think that teaching programming is a decent way to teach this "higher level abstraction and design". It's just taught along with learning a particular programming language.


> They’ll just need good higher-level design thinking so they can clearly, logically explain the computer’s task.

... and how do you develop that skill?

While the "search all existing code" approach may lead to improved security (for example) if such considerations are factored into the search, I'm unsure how this would permit us to do something part of which has never been done before. The article probably does the project a disservice by projecting it in the wrong direction, which hides other very significant benefits it can bring to the table today.

IMO, Wolfram Language deals with the human interface in a much better way than whatever (I think) searching through code can give you. WA generates multiple programs that are possible interpretations of a given query and runs them all, or a pruned set of them, depending on out-of-query context (at least as far as I understand it). This is possible only for those cases without side effects. If some imperative code for calculating prime numbers surreptitiously has an instruction to launch a rocket, how would a naive user know?

In other words, such an intelligent system will need to distinguish between "doing something" and "an instruction to do something" - which is basically imperative languages versus monads. The vast majority of code is written imperatively and this kind of eager execution of multiple possible interpretations of an ambiguous instruction would be unsafe without that ... at least unsafe without the ability of the "programmer" to read the resultant code before accepting it.
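The distinction the parent draws can be sketched in a few lines. This is a minimal illustration (all names here are hypothetical, not from the article): a program is represented as an inspectable value rather than run eagerly, so a user can review candidate interpretations of an ambiguous query before any side effect fires.

```python
# Sketch: "an instruction to do something" as a value, separate from
# "doing it". All names are invented for illustration.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Action:
    """An instruction to do something -- not the doing itself."""
    description: str
    run: Callable[[], object]

def primes_upto(n):
    """A pure candidate interpretation: no side effects, safe to run eagerly."""
    return [p for p in range(2, n) if all(p % d for d in range(2, p))]

def launch_rocket():
    raise RuntimeError("launch!")

safe = Action("compute primes below 20", lambda: primes_upto(20))
unsafe = Action("compute primes below 20, then launch a rocket", launch_rocket)

# A cautious system shows each candidate's description and only runs
# the one the user accepts:
for candidate in (safe, unsafe):
    print(candidate.description)

print(safe.run())  # only the accepted action is executed
```

Eagerly executing every candidate interpretation would fire `launch_rocket` too; deferring execution behind a value is exactly the imperative-versus-monadic split the comment describes.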


Teaching programming is like teaching constructivism in logic. It teaches rigour and produces working things, and a focus on algorithms, even if most people don't see what they're doing as having anything to do with algorithms. Very valuable that. There's the issue that some things are fundamentally out of reach of constructivist thought, but it certainly is an easy way to get pretty knowledgeable.

In my opinion the easiest jobs to automate are the higher-up jobs in companies. A top executive essentially has to make two judgements. First, tell if someone is lying, wrong, or otherwise doesn't know what (s)he's talking about, without having any first-hand knowledge of the subject. Second, basic (and only very, very basic) NPV calculations.

The second problem is so extremely easy my 5-year-old daughter is probably pretty close to being able to automate it.
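For readers who haven't seen it, the "very basic NPV calculation" the parent mentions really is only a few lines. A minimal sketch in Python (the discount rate and cash flows are made-up numbers for illustration):

```python
def npv(rate, cashflows):
    """Net present value: discount each cash flow back to period 0.
    cashflows[0] is the up-front amount (usually negative)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical project: invest 1000 now, receive 500 for three years,
# discounted at 10% per period.
print(round(npv(0.10, [-1000, 500, 500, 500]), 2))  # 243.43
```

A positive NPV means the discounted inflows exceed the up-front cost, which is the whole decision rule.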

The first problem is the perfect computer science problem. The key is here: "without any first-hand knowledge of the subject". The decisions higher-ups in companies make, they have to make without having real knowledge of what they're deciding. So there is no advantage in having knowledge about first principles, simulation, ... They're generally not going to be beaten by any engineer, because an engineer will want exact answers, and those don't exist. So their decision process HAS to be based on statistics, numerical analysis and nothing more. They will obviously disagree, but everyone does (I remember very heated discussions about robots not being able to make and/or combine "non-trivial" auto parts during my vacation jobs long ago).

I did a startup, and we eventually hired an older CEO, mostly because he could get us more funding than we ever could. Now, the guy was everything I expected: zero knowledge about what the company was doing, and not interested in learning more about it (beyond polite interest I mean, he was very professional). Surprisingly, his sales knowledge was also tangential at best. Now I wanted to know: what did this guy know that I didn't that made him successful? But I just couldn't find anything where his knowledge didn't seem trivial at best (outside of physics, which he got a doctorate in an eon ago, but I'm pretty sure he wasn't using 50-year-old high-energy physics knowledge to lead the company). Eventually I found it. For a specific funding round we had to get a "model" of the business going. And we worked on it together ... so I worked with the guy for a few weeks.

What. The. Fuck.

The guy built a model, in Excel, that was just far beyond anything I even thought possible in Excel. In fact, it was significantly beyond what I could get done in real programming languages, especially given the time available. He had all components of the business in there, and how they evolved. Employees, equipment, rent, lines of credit, ... and all sorts of things. Assuming we use the most efficient credit line to pay for things, employee onboarding means 2 months of half productivity, X new features per month, X sales calls per month, X% of them successful; when we get a customer, X days to install, X days to first payment, X months the customer stays, then leaves; slowly ramping up to bigger and bigger customers, ... And what came out is what happened to the company's bank balance. My mind was blown in a pretty big way.
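A toy version of that kind of month-by-month cash model is only a short loop. This is a sketch in the spirit of the spreadsheet described above, nothing more; every parameter value is invented for illustration:

```python
# Toy month-by-month cash model. All numbers are invented.
def simulate(months, cash=100_000, salary=6_000, staff=4,
             new_customers=2, revenue_per_customer=1_500, churn=0.05):
    """Roll the business forward month by month, returning the
    projected bank balance at the end of each month."""
    customers = 0.0
    history = []
    for _ in range(months):
        # Some existing customers churn, new ones arrive.
        customers = customers * (1 - churn) + new_customers
        # Revenue in, payroll out.
        cash += customers * revenue_per_customer - staff * salary
        history.append(round(cash))
    return history

balance = simulate(12)
print(balance[-1])  # projected bank balance after a year
```

The real model layered on credit lines, onboarding ramps and sales-funnel conversion the same way: each is just another term updated per period.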

Now it wasn't perfect. Far from it. He got the gist right, though, and looking back about a year later, he wasn't that horribly far off, within about 100k euro. And more importantly, it got investors on board. I didn't believe the model he made, to be honest, but I was wrong.

But going back to that data, using AI methods I could (after the fact) build a much better model predicting the business. Of course it isn't an Excel file that could be passed to investors, but ...

But here's what I eventually found out his job was: to go out and negotiate. Takes a bit of flair, but anyone can do it provided you do the next thing right: take the terms on offer, translate them into this Excel model of the business and see how they affect it. If they make the business succeed or just barely fail, accept them after a bit of haggling. If they don't, don't bother with the haggling; just politely and professionally tell the guys to get stuffed.

Now of course, this can be automated. I believe this is much easier to automate than programming the company's products themselves, which requires domain knowledge no-one has when the company starts. It is generally a bloody hard problem for any new kind of product.

And middle management, even of programmers, would be a lot easier to automate than even this.

I think that's what's going to happen. Some people, probably financial companies first, since they really are only numbers, will put AI algorithms in charge of departments, which will include programmers. And then the owner will die, simply of natural causes, but the company just stays running. And 200 years later we will be wondering. Hell, I'm betting several such companies already exist, as it's certainly been possible to do this for 10-20 years now.

Because one thing I'm very very sure of. No matter how good you think you can model stuff as a human, you will get your ass handed to you by relatively basic predictive algorithms. Of course, those algorithms, they don't know what they're talking about, beyond the numbers. But neither does management, so it can't be critical for the function.


RE: <The guy built a model, in excel>

Do you have a template of that model?


> MUSE will assemble a massive collection of chunks of code that can perform almost any task anybody could ever think of and tag all the code so it can automatically be found and assembled.

Good luck with that.

But seriously, "Art" is the worst term the author could have chosen for anything that is in danger of automation. To the extent that computer programming is an art, how could it be in any more danger than music, painting, or literature?


If you showed a 1970s person in my field what can be knocked together in R or in Python today, they would be blown away. In that sense, as more and more code reuse through libraries happens, the role of programming for the majority changes from low-level difficult stuff to simply strapping bits together. I can see this trend continuing until we have a convergence of e.g. IPython and spreadsheets. In that sense everyone will become a programmer of sorts. But hardly anyone will be a 'real programmer'...


No, that 1970s person in your field would more likely have shown you what they knocked together with APL, and it would have blown you away. Not all of what we're doing can be termed progress; we're reinventing a lot of stuff all over.

Python is essentially a much better BASIC; there isn't much that I can do today in Python that I would not have been able to do in, say, GFA BASIC more than two decades ago.

Our machines have gotten faster, our memories are much larger, and so are our drives. We take networking for granted.

But our programming languages are only a very small step up from what we had back then, and our ability to solve harder problems comes mostly from the faster hardware, the bigger RAM, the drives and the network.


Here's the thing: if this thing actually works, not only are all programmers going to be out of a job, but everyone else will be too.

I really wish these articles would stop talking about how IBM made a computer that can beat a grandmaster at chess; how about talking about how computers can't beat my 9-year-old at Go. There are a lot of problems which computers currently excel at; creative thinking isn't one of them.


You know... that's really one of the best analogies to NP-complete problems in computer science... if a solution is ever found to one of them, then all NP-complete problems are solved.

To use your excellent example... computers can't do everything... such as constructing a program that fixes all bugs. Not that humans can do any better, but the humans who control the computers can at least step in and instruct the computer how to fix itself, or at least mitigate the damage. Once a computer can be programmed to program and fix all of its own bugs... well, then we won't need programmers, nor will we need humans, either.


Whoever downvoted this has a reading comprehension problem.


Wow,

I'm upvoting this only to encourage someone to write a good article with a similar title.

gavinpc writes: "'Art' is the worst term the author could have chosen for anything that is in danger of automation."

Indeed, the only logical argument for computer programming as a dying art is that the art aspect is being drowned by armies of low-paid contractors and start-up workers. But I'm not saying that's true. I'm just pointing the way in case someone wants to write the good "Computer Programming Is a Dying Art". I'll wait till then to comment.


Yep! Exactly! I clicked the article expecting to read a reasonably written lament about the fact that programmers are less artists and more artisans these days in the way they approach the act of programming. Amusing thought though: I wish someone would write an article in Newsweek with the headline "Journalism Is a Dying Art" and then go on to describe news aggregation.


Wow, this is a disturbingly naive article. Who writes and maintains MUSE? How do you express to it what you want done? Who writes the code that feeds into it?

As for computers that behave like human brains, who writes that code? On what planet is teaching a human a task more efficient than writing computer code? How is it an advance in any sense to turn computers into human brains? We already have billions of human brains in the world, and they cost nothing to produce. Let's not waste our time trying to build electronic versions of them.


> Of course, someone has to write the raw code, just as some farmer has to grow the wheat that winds up in the box of rigatoni at Safeway. But, as happened with farmers, far fewer coders will be needed as existing code gets repurposed. Theoretically, just one person on the planet will have to write the raw code that makes a computer perform a certain task.

Just that one person. Sure.


>Who writes the code that feeds into it?

Yeah, they "addressed" that issue in the article -- the code will come from tagging the world's open source projects. Once it's all tagged, it should work together, right? I mean, it's all just code, right?


More importantly, who will have access to MUSE?


I love how this article tries to equivocate with statements like this one:

> In the end, far more people will be able to program without knowing code. They’ll just need good higher-level design thinking so they can clearly, logically explain the computer’s task.

Apparently these folks haven't figured out that the language humans use to "clearly, logically explain the computer's task" is called code.

I don't imagine that programming as we know it will remain unchanged forever, but one of the greatest accomplishments of this era of computing is that we did create a system for clearly, logically explaining the computer's task. I find it annoying that this article downplays that achievement, going so far as to suggest that we shouldn't teach kids to code. Y'know, because the last 60 years of prognostications of "we'll have thinking machines in 5 years!"[1] have worked out so well.

[1] As near as I can accurately recall, a quote from a '50s newsreel segment filmed with an MIT professor.


Even by the "standards" of Newsweek, this article is supremely stupid.

Current high-level languages make so few concessions to the underlying execution engine that all they really are is convenient notation for clear and logical thought. The only reason they seem needlessly hard to your average Newsweek hack is his/her own shortcomings in the clear-and-logical arena.

English, like other "human" languages, is (perhaps) good for making highly concrete grunts to one's hunting party, and for making jokes to impress chicks. It was never well suited for describing complex processes and procedures. And complex processes and procedures are what define an advanced economy. Every area of advanced-economy endeavor suffers from having too much of its workings half-assedly and imprecisely "described" in a "human" language. And honestly, for no other reason than that too many "stakeholders" are too bloody dense and lazy to learn anything better suited.

But I guess there are those who think mathematics would be better off without all that weird, difficult notation as well. With mathematicians instead hand-waving and counting on fingers, in front of calculators "smart" enough to learn from humans...


> "Writing code is a terrible way for humans to instruct computers. Lucky for us, new technology is about to render programming languages about as useful as Latin"

Jesus. Well, this coming from the same publication that thought the founder of Bitcoin was as simple to find and verify as looking in the phone book and conducting an ambush interview...I can see why they don't sweat the details of accuracy and nuance.


When I was 20 (and still overly ambitious) one fine evening I started a program optimistically called 'the last one'. It was meant to replace me. I figured that I knew enough programming by then to write a program that would take a spec and spit out another program.

Why it failed I'll leave as an exercise to the reader (go build it!), but let's say that 30 years later I'm not at all worried that programming will die as an art form any day soon. Because for that to happen we'd have to elevate computers to the levels of creative thinking normally reserved for us humans.

We can make tools to make it easier, ways that will afford us higher levels of abstraction using better programming languages. But those are mostly incremental steps, a better drill versus the previous generation drills.

And those tools will allow us to do more with less input from the programmers. But for now I don't see any revolutionary development on the horizon other than quantum computing and practical implementation of those for general problems seems to be at least several decades out.


Whether programming is dying or not, Mr. Maney (the author) seems to have figured out how to write articles that generate traffic for Newsweek: take something popular and predict its demise, or that of something related: http://www.newsweek.com/authors/kevin-maney-0


Just think of the legal licensing concerns this database would entail. GNU/BSD/ASPL/MPL/whatever else code all living together, and trying to assemble it into a reasonable harmony would be almost as complex as actually compiling viable software.

Then we get into architectural details, like 32/64-bit, endianness, Unicode flavor of the decade/ASCII/EBCDIC strings. Not to mention call by value, call by reference, calling across languages, reference counting, value boxing, garbage collection, thread safety, pointer data alignment, struct packing, dynamic-linking dependencies and symbol versioning.
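The endianness point alone is enough to trip up naive code reuse. A small Python illustration (the byte values are arbitrary): the same four bytes decode to different integers depending on the byte order the original code assumed.

```python
import struct

# Four bytes that mean "1" on a little-endian machine...
raw = bytes([0x01, 0x00, 0x00, 0x00])

little = struct.unpack("<I", raw)[0]  # x86-style little-endian
big = struct.unpack(">I", raw)[0]     # network/big-endian

print(little, big)  # 1 16777216
```

Two automatically glued snippets that silently disagree on this convention would produce numbers off by a factor of sixteen million, with no crash to flag it.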

I'm sure someone long ago wrote that we'd have interchangeable car parts by now, so that one spark plug would work in every car. And we'd just pick out an axle, suspension, steering wheel, paint color, transmission, starter motor, and windshield wipers from a catalog and be able to build the car of our dreams just by pointing to its parts.


> The thinking is that of the 20 billion lines of code written each year, most of it repeats something that lots of programmers all over the globe have already done. MUSE will assemble a massive collection of chunks of code that can perform almost any task anybody could ever think of and tag all the code so it can automatically be found and assembled.

Excess hyperbole aside, this really is an interesting project. This seems like a logical extension of the existing practice of creating libraries, except that the effort of gluing code together would be automated.

That being said, it's hard not to point out the invariable difficulties:

* A large proportion of the 20 billion lines is probably fixing bugs or working around design decisions.

* Using pre-written code in a new context will unearth a few unaccounted-for edge cases.

* Getting code from multiple languages to interoperate without human intervention would be a coup in and of itself.
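The tag-and-retrieve half of the idea is the easy part. A naive sketch of the lookup side (the snippet database and tags here are invented, not from the project); the hard, unsolved part is making the retrieved pieces actually compose, which is exactly what the bullets above point out:

```python
# Naive MUSE-style tagged-snippet lookup. Snippets and tags invented.
SNIPPETS = {
    "reverse a list": {"tags": {"list", "reverse"}, "code": "xs[::-1]"},
    "sum of squares": {"tags": {"sum", "square"}, "code": "sum(x*x for x in xs)"},
}

def find(query_tags):
    """Return the names of snippets whose tags cover every query tag."""
    want = set(query_tags)
    return [name for name, s in SNIPPETS.items() if want <= s["tags"]]

print(find({"list", "reverse"}))  # ['reverse a list']
```

Retrieval by tag is a solved problem; stitching the returned `code` strings into one working program across languages and conventions is where the 20 billion lines stop cooperating.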


I was going to address the problems of the article but it seems like other people have already addressed these points.

However, I will mention that it seems no one on Newsweek's staff has an iPhone or browses via mobile, because their site experience is terrible.

Page loads and I try to scroll past the image. Can't. Wait a while and finally I can start to scroll. Then the site does a huge redraw and once again I am stuck at the top of the page, unable to scroll. Finally an advert (which I guess is the reason for all the lagginess) takes over the entire screen. It is animating, has an optional video to load, and refuses to acknowledge my touch on the close button.

Is this what the web has come to?


HAHA. Nice going, Newsweek. This idea has been around for over 50 years, and it's always just around the corner according to rags like you; like practical nuclear fusion, it's always 50 years in the future for anybody who's got a clue.

But just in case we actually can teach computers to understand us, "solving programming" will be one of the least exciting things that'll happen at that time. Other things may include, but may also not be limited to: extinction of the human species, technological singularities, deconstruction of the Earth and Moon to build a matrioshka brain, and so forth.


Click-bait.

For the foreseeable future, computer programming will be a flourishing art and non-natural languages will be the primary way of interacting with computers for non-trivial tasks, if only due to the ambiguities of natural language. There's a reason mathematics still uses specialized notation after all these years.

I do think we can say with relative certainty that Newsweek is a dying company, however.


I am getting tired of these ridiculous articles.

It's like a dream for people who really want to believe that the people who program computers are wasting their time, because they themselves will never have the drive to learn how to do it.

I know that there are a lot of projects to make software development obsolete, but I can't see it happening, not on the timeline of 20 years they're predicting.


Sounds unlikely. I would like to not have to tell the computer exactly how to do stuff. We will see if that will be possible in the future.


Funny. Who do they think will be teaching computers how to understand humans?


Paywalled.


The 1970s called. It wants its uninformed futuristic bullshit back.

"Everyone should learn to code" is a bit ridiculous, given that technology's biggest problem right now isn't a lack of capable coders, but that of the competent people who are out there, few are rarely trusted to tackle real work. (See today's blog post: http://michaelochurch.wordpress.com/2014/05/30/technologys-l...).

The theory might be that AI will render programmers obsolete. The reality is that we've been commoditized and humiliated by process and business nonsense, even though we're nowhere close to achieving AI. What has actually rendered our prospects poor in comparison to what they should be isn't AI but the diametric opposite: Human Stupidity.



