Don't Learn to Code, Learn to Program – But Come Back in 10 Years (johnkurkowski.com)
108 points by dirtyvagabond on Feb 5, 2014 | 92 comments


My paraphrase of this article:

"All of the blood, sweat, and tears poured into making programming more productive over the years just isn't good enough. No, I haven't done anything, personally, to improve the situation. My plan is to continue bitching about how things aren't good enough until...well, indefinitely, because bitching about other people's efforts is much easier than doing the hard work of solving a challenging problem myself."

Is that too harsh? I don't see a single concrete, actionable proposal in there, let alone anything the author has personally done to fix the sorry state of programming.

I also think it's a huge disservice for him to dissuade his friends from programming until some state of nirvana is reached in programming tools. There are some hurdles to overcome, but a little bit of programming skill can greatly amplify the reach, scope and influence of non-programming talents.


You got my upvote. I can't understand the point of view expressed in this article at all. I particularly can't understand the claim that the accessibility hasn't improved.

Yesterday I implemented a couple of different machine learning algorithms on some test data using Python and sklearn. Now, I'm just a noob - but there is no way I could have implemented a machine learning algo on some data that quickly 10 years ago. It would have literally taken me months to do the equivalent thing rolling everything myself. And even then my code would probably have taken several days to run for any dataset of reasonable size, because I wouldn't have a clue how to build numpy's optimised data arrays etc...
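For a sense of scale, the kind of workflow described above fits in a handful of scikit-learn lines. This is only a rough sketch - the dataset and model here are arbitrary stand-ins, not anything from the comment:

```python
# Train and evaluate a classifier in a few lines of scikit-learn.
# Dataset (iris) and model (logistic regression) are arbitrary choices
# for illustration.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = LogisticRegression(max_iter=1000)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```

Rolling the equivalent by hand - the optimiser, the vectorised linear algebra, the train/test plumbing - is exactly the months of work the comment describes.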

Plus all this code is just given away... for free... no questions asked. It's so unbelievably awesome it boggles my mind.

So yeah - I don't get why this author isn't giving all that its due. It really deserves a crap tonne of due, y'know?


I read it as: "programming could be so much better; I have a glimpse of what it could be like, but don't know how to get there."


I like your reading better. I would have liked the blog more if it had had more of the humility that your words have.

The blog felt like it was saying "shut up till you do something better" where it should have said "Imagine how cool it would be to do something better".


A humble approach is always nice; it's always challenging to give an earnest opinion (after all, that's why I read your content - for your opinions) and be openly modest. Your (the author's) terminology in differentiating programmers from coders is different from my own naming. My wording is often around application development vs. systems programming: a sys programmer writes software for humans to control machines (databases, OSes, tools), whereas an app developer is focused on software (computers) that controls humans (apps that constructively shape human actions). I liked the article - good post.


I totally understand the viewpoint though that there were 0 solutions in the article. A rant like that is off-putting. Still, I felt it worthwhile to bring awareness to the, ah, malarkey in programming.


This is an entirely pointless false dichotomy. "Coding" and "programming" are just synonyms for the same damn thing: making a Turing machine do stuff. I have no idea why people are suddenly trying to make a false distinction when there was none for decades and decades, but frankly it's stupid. My only thought is there's a bunch of butt-sore guys who are upset that learning to make a computer do stuff is nothing special and are now trying to find a new way to differentiate themselves with the moniker "Programmer", looking to shit on anyone who's "Just a Coder".

So I propose a new term. Turingulate: verb. To make a Turing machine do stuff. Typing Python into a file and running it with python? Turingulated! Running a Pd graph to create a fuzz pedal? Turingulating! Using Max/MSP? Turingulating! Programming because you're so special? Turingulate. Coding to... uh, code some... uh, code... which isn't programming? Turingulator!

So stupid.


Colloquially they are synonyms. Especially amongst people who don't make software. It's not derogatory and not worth getting bent out of shape about.

BUT ... (Bet you saw this coming)

Coders turn detailed specs into code. No real thought required.

Programmers, or software developers, design systems and code them themselves.

That's the distinction I make anyway. I've seen coders at some workplaces. It's a dreary job and I would never do it.


I have seen similar articles with the word "programmer" used in place of "coder" and "developer" used in place of "programmer". I don't think there are any hard and fast rules as to what makes a coder / programmer / developer / software engineer.

I think that is part of the problem. The outside world sees these as equal. While after 10 years in the industry I can see the difference between someone who can write a script that will get the job done (for the time being), and someone who can design / write decent code that you expect to be working a year or two later.


I cannot agree more. I'm bored of those nonsense term distinctions - if you look on LinkedIn or the like, everyone is his own software architect, war god of the mighty developing...

Turingulators!

A Programming Motherfucker


I wish I had the nerve to put A Programming Motherfucker on my résumé...


I notice this kind of article a lot these days. It feels like trying to come up with something to write out of nothing.


Programmers write code.


Hold on a second. OOP, markup, APIs, and HTML/CSS/JS are snake oil, while visual programming isn't? Say what you will about the former technologies, but they're being used for productive purposes. Visual programming is vaporware that someone tries to build every 5 years or so to no effect. Functional programming has much the same challenge, but it's getting integrated into mainstream languages. Otherwise, it would likely still be a research toy as well.


I would make the argument that purely functional programming languages have already seen success and are seeing more as time goes on.


http://en.wikipedia.org/wiki/LabVIEW

I believe that's an example of visual/graphical programming, it's fairly mainstream in some areas and not vaporware.

Re: Rest of comment - Filtered site, I'll read it at home.


I used LabView in college and a couple of other visual programming schemes in my work that have since disappeared, and they all had the same drawbacks: If your needs fit tidily within the assumptions made by the designer, they can be very productive indeed. Once you step a little bit outside of that, they turn into a tortuous mess. Can't even count the number of times I wished I could just write a few lines of code instead of trying to figure out a byzantine way to hook up enough icons that I got my desired end result.


I've always been told by LabVIEW advocates that "you can of course pull up a text editor and write code." The language and API are certainly a different matter, being second-class citizens and presumably an afterthought.


There's a bit of a stereotype about LabVIEW that it is graphically beautiful, but it's not really visual programming, because it inevitably results in more typing of text than anything else - otherwise you get a bowl of spaghetti. With textual languages you can interpret the code yourself to figure it out, perhaps a billion times slower than the computer, but in exactly the same way. Visual is a little harder to figure out.

Or in summary: I've seen large, complicated LabVIEW that's totally write-once, read-never.

You might be interested in screenshots from GNU Radio. Same effect: a spaghetti bowl of a schematic where a plain-text version would be a lot clearer. Google for a simple WBFM radio with squelch and AFC - aka your $5 Walkman broadcast radio - and you get something that looks like it's launching the Space Shuttle.


I never personally used LabView (well, maybe in a Physics 101 lab in college, but that was a while back). The people using it in the offices I've worked in have generally been electrical engineers, perhaps the schematic view meshes better with their mental model of computation. It seemed to work well on most of the projects.

EDIT: I should add, when I did look at the LabView stuffs, it seemed to make sense to me (as a reader), but I'm not sure how easy or difficult it would have been for me to jump into and start working on after someone else has constructed a lot of parts.


I suspect that of equal importance to the schematic view, for electrical engineers, is the dataflow programming model. Another dataflow-based programming tool, albeit non-Turing-complete, is Excel.


Excel is not dataflow based, and it is not non-Turing-complete; you're thinking of something else altogether.

To guide intuition without making things tediously formal, most things that can go into an infinite loop are Turing equivalent.


I have thought of Excel as the closest real thing we have to visual programming.

It's a shame that it seems to be based on a "grid" rather than a better data structure. I guess that is why it makes sense to most non-programmers, though.


LabVIEW and the rest of visual/graphical programming needs to perfect the feature for cleaning the code layout. LabVIEW does a good job about half the time, but the other half results in time wasted to make the code legible. Mathematical formulas look nasty, which results in having to use a "Formula Node"[1]. The other major problem with LabVIEW is the speed at which you program is not much faster after you understand the fastest ways to do things, because you are still limited to how fast you can click and connect nodes.

[1]:http://zone.ni.com/reference/en-XX/help/371361K-01/lvconcept...


Having used LabVIEW, I'd say that if that is the pinnacle of visual programming, then it has quite a bit of improvement needed.


Did I say it was the pinnacle? GGP post suggested that visual programming was vaporware. Perhaps for general purpose programming it is, but visual programming itself is not vaporware.


Visual programming has small niches where it fits well. GUI building for example. Actually the query builder in MS Access wasn't so bad, but it was a choice of learning that or remembering "proper" SQL when I used it. Proper SQL seemed more useful.

Likewise with functional programming. I use small bits here and there when it is appropriate, to write cleaner code. I would struggle to build anything of any size using FP.
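Those "small bits" of FP are often exactly what tidies up imperative code. A hypothetical Python example (the data here is made up for illustration): a generator pipeline replaces an explicit loop-and-accumulate.

```python
# Small functional touches in otherwise imperative Python:
# a generator pipeline replaces an explicit accumulation loop.
orders = [
    {"sku": "a", "qty": 2, "price": 3.0},
    {"sku": "b", "qty": 0, "price": 5.0},
    {"sku": "c", "qty": 1, "price": 4.5},
]

# The imperative version would loop, test, and append to a running total;
# the functional version reads as a single pipeline.
total = sum(o["qty"] * o["price"] for o in orders if o["qty"] > 0)
print(total)  # 10.5
```

No need to commit to a purely functional language to get this benefit - which is arguably the parent comment's point.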


> "Programming is a notion to extend human capability, by offloading humanly-infeasible work onto a machine. It is the promise of an amplified knowledge worker. This would be worthy to learn."

With this (correct) notion the author continues until he decides that

> "Meanwhile there are few souls looking to evolve text code into something more humanly intuitive. For example something you can touch, you don’t have to read, or that tells you what the computer is thinking, so you don’t have to think like a computer yourself."

I can't help but feel that what the author really wants is some futuristic kind of AI assistant. Decades of programming progress have resulted in numerous abstractions piled on top of one another in order to minimize the amount of "thinking like a computer" that you have to do. But all abstractions are leaky, and when you encounter those leaks you will inevitably have to be able to think like the machine you are trying to communicate with.

Why is it acceptable to say that effective communication with people depends on being able to think like the person you're talking to, but demand that communication with a far more different and limited machine should never require you to compromise on your preferred thought patterns? This is the fundamental fallacy of the natural language programming advocates. They want to use another entity to do things for them but demand all communication with that entity be in their preferred language, on their own terms.


Moreover, I'd just jump in here to point out that effective communication with people quite frequently involves "text code".


Is the entire premise of Neuro-Linguistic Programming (NLP) not that you can influence people's thought patterns by using language as code, as instructions?


That's basically my impression, but my understanding is that NLP is not very strongly supported so I'd be leery about using theories from there as if they provided much evidence about the world.

I don't think you need to get so vague though - if you want to communicate a task to a human, you very explicitly use language most of the time (particularly for anything complex). They may not follow your instructions for whatever reason, but the act of communication is done with text.


> Why is it acceptable to say that effective communication with people depends on being able to think like the person you're talking to, but demand that communication with a far more different and limited machine should never require you to compromise on your preferred thought patterns?

Other people are sentient beings with their own personalities, dreams, desires, and attitudes. Recognition of our shared humanity necessitates some level of compromise, respect, and accommodation when dealing with other individuals.

In contrast, computers are unfeeling machines. We made them, and within the limits of physics, math, and our own creative powers, we can make them do whatever we wish. Why shouldn't we strive to make them as accommodating as possible? Why shouldn't I try to create a machine that intuitively corrects my mechanical errors to represent the shape of my ideas?


There's nothing wrong with trying. There is, however, something wrong (or at least irritating) with demanding that it magically become like that, without trying or contributing any insight into how to achieve it.


Just because something is hard doesn't mean it's broken. Nobody would say that, say, advanced mathematics is broken because the layman doesn't get it.

My bullshit detector notes that he talks very generally about "knowledge workers leveraging machines". The problem with the phrase "knowledge worker" is it often means "person who sends lots of emails". You don't need to be a programmer to send lots of emails.

Real "knowledge workers", i.e. the kind of people who actually are working to extend human knowledge and need computers to do it, largely do know how to program to some degree (scientists and mathematicians and so on). A lot of it isn't "professional" grade, but it works well enough for them and it keeps getting better.

Also a tangent: but I'm sick of the whole "coding is literacy" nonsense. Not everyone needs to code. Literacy unlocks centuries worth of knowledge in any domain you want, from contemporary thinkers to people who are long dead. Coding lets you tell a computer how to do specialized things. It's not even remotely comparable.


Exactly. Life is hard. Business is hard. Coding doesn't eliminate work, it just enables new kinds of work. And likewise, coding an app that makes money isn't completely effortless, but it is obviously possible.

Coding != literacy. In fact, not everyone needs to go to college either; plenty of people can still make a living without it. Mike Rowe's trades-awareness work is fantastic: http://profoundlydisconnected.com/ There are many trades where the average age is 50 because no one wants to be seen as blue collar. Some of these are high-income opportunities that are being missed because they're not mainstream fashionable.


> Nobody would say that say, advanced mathematics is broken because the layman doesn't get it.

We do make math more tractable and accessible over time with various reformulations and new abstractions and conceptual tools, though. You're probably using Arabic numerals when you do arithmetic, and there are good reasons for that.

(Though heaven knows, if I'm likely to run across an abacist anywhere in this sort of discussion, it's probably here. :)


And computers have been made more tractable and accessible over time with reformulations and abstractions. And at a much faster pace.

People who think that programming is not dramatically easier today than ten years ago just have a short memory. Or they're 25 years old...


I wouldn't say it is easier (or harder); I would say it is different. Before, you had to remember lots of tricks to fit as much information into as little memory as possible. Today we have lots more stuff to remember. Operating systems are way more complex.

At the end of the day, most people are programming at a far higher level, so things are more productive. I have produced a reasonably sophisticated database / web front end for my work, in Django, as a one-man team. Ten years ago I believe it would have taken far more man-hours to achieve the same thing. I don't have to remember tricks to compress data, as disk space is cheap now. I do have to know reasonably advanced SQL, and remember a lot about how Django works.


But in the end, most people can't understand vector calculus, just like most people can't understand programming. No matter how easy you make the syntax, the concepts themselves can be hard to grasp.


Agreed.

Woodworking is hard. The more you know and the better your toolchain, the better your projects and experience.

At one time, woodworking was incredibly relevant, so there was a good payoff for dabbling in it; you'd have a few tools and get "okay" or "pretty good" without becoming a professional woodworker.

Today woodworking is irrelevant, so very few learn it.

So it is with programming.


I don't think we have to worry about "everyone" or even a small percentage thereof learning how to code. They won't. It's not that they can't, but it takes more than a couple of hours writing "Hello World" to do anything substantial and most people are not willing to put in the time, day after day, to learn anything substantial.

That being said, I can't imagine anything more human and more powerful than text. If I want to give instructions to a _human_, the best way I know of is to write them down as clearly as possible. If this works with a complex person, why would it not work with an incredibly simple (from an abstract point of view, not implementation) microprocessor. After all, all processors have an extremely limited set of instructions (even CISC ones).

This desire for something that is not text seems misplaced. After all, text is (IMO) certainly better than pairs of numbers indicating operation and data. Writing is one of the greatest human achievements. To dismiss it in favor of touch or something graphically visual is to disregard its power and superiority to both touch and visual systems of communication.

When designing a system, if I am able to give it commands in English (or any other language), this would be vastly superior to anything visual or touch based that I can think of or that the article specifies (or rather doesn't). To take the whole system of a language like English, with its complex idea-expression engine and its vast array of words (1 million+), and discard it is ridiculous. Now, we are not currently programming in English, or anything even remotely close (though many languages' grammar follows from natural language somewhat), but I do believe this is, and has been for decades, a goal that may one day be achievable. This is desirable, as humans are naturally inclined towards language, and those who practice it can be incredibly good communicators (programmers of reality).


"It's not that they can't"

No, they really can't. Garbage in, garbage out. You can't implement an algorithm or follow a protocol unless you understand it, and the world is full of people who literally don't understand how a thermostat works, how to drive safely, or how to follow the NRA's three basic rules of gun safety.

Note that our shared enjoyment of text as per your third paragraph is only politically correct WRT programming... if you dare to suggest text might also be the superior user interface, the dogpile will begin. Which is sad. Closed minds are always weaker than open minds...

Let's play a game: I'm thinking of a well designed, expressive, long-lived programming language developed by a linguist... It's dumped on here at HN for no logical reason, just the usual women's-fashion fad explanations. I suspect this would happen to any "English-like" future programming language.


I should have been more clear in my writing as I did not mean to suggest most people could program, but that even smart people with the potential to learn it (a very small percentage) generally are not interested in investing the time.

I agree, text (reading and writing) is definitely a dying art, especially in America, although to me it seems to be part of the whole anti-intellectual culture brewing here and in other parts of the world.

Perl?


"just the usual womens fashion fad explanations"

Very classy, dude.


Men can be fashionable too, it's just typically not as fun for us. I would say most of the time the only reason men are interested in fashion is to become more attractive to women.

Edit: I realize this kind of generalization probably offends people. I'm sorry, I just believe that men and women are fundamentally different from each other. Here's my google search findings on the topic:

http://www.lloydianaspects.co.uk/evolve/fashion.html


The author is frustrated by a real and frustrating problem, but he's missing why things are this way.

The "triumvirate of HTML/CSS/Javascript" isn't just a technology. It's a social consensus. It's not actually "designed" in a meaningful sense -- it's evolved.

And things pretty much need to be this way, unless you want to be an island. Even if you have the ambition and the resources to throw it all away and "do it right" from scratch, by the time you finish the world will have moved on and your perfect gem will be obsolete. Real technology is always iterative.

The same constraints apply to programming languages in general, because they truly are languages that human beings use for thinking. Languages and their associated cultures can only evolve so fast. It's not really a technical issue, it's a social issue.

The author's assertion that our "arcane" interfaces persist out of a kind of perverse pride just doesn't ring true to my experience. Programmers cry tears of joy when they get to throw away something old and arcane because they don't need it anymore. And his assumption that visual programming will necessarily be better than textual programming is completely unproven.

Anyone who could actually make programming easy would reap vast wealth. The incentive is there. If it's really a no-brainer to build visual programming tools, well, where are they? There must be something harder about it than you think if we're still waiting.


An interesting thought about evolution is that text-based programming languages are exquisitely evolvable, because the technology and effort to evolve a language or create a new one aren't too far out of our reach. If you don't care too much about efficiency, you can create a new language from an old one and pass it around for others to try out.

This may be why there are hundreds of text-based languages, but I can think of only one graphical language - LabVIEW - in widespread use. LabVIEW requires a monumental organization to maintain. Adding a structural feature to the language requires changing the entire GUI, menus, and so forth.

I suspect that if LabVIEW came out with a text based option for dataflow programming with support for their massive libraries, folks would abandon the graphical interface.

Maybe graphical isn't always better. Maybe text is really the best way to express programs after all. We could wait for programming to become "intuitive," or learn to develop our intuitions.

An anecdote about graphical systems. When I read tutorials for Windows, it's usually a lot of text interspersed with pictures of windows and dialogs. Even videos. Walking someone through a process over the phone is excruciating.

The same process using Linux: Open the terminal and enter this text.

In fact, Windows tutorials are starting to use: Press the start button and enter some text. I find the device manager by entering "device manager" into the start button menu, not by following an "intuitive" GUI.


"A relief from the unintuitive, unscalable, unfounded snake oils like OOP, markup, APIs, or the triumvirate of HTML/CSS/JS. As if these technologies are the best we can do."

At first I was mad. Then I read the above quote and everything clicked into place. This article is hilarious and I love every shrill, indignant, purposefully absurd word of it.


Probably the biggest bit:

"Can’t I just make a website?"

That type of question is where a lot of things go wrong. Newbies ask that because they don't have the experience to break it into smaller parts, experienced people get tripped up because they can't see past the "well, what type of website doing what?"--all the while the question is "what would you use it for?".

It's a bad question, because the asker doesn't know enough to ask a useful one. Our job, then, is to help them learn how to ask the right question.


There is no point in teaching people something they don't like doing. It's a bit like forcing painting or sketching on people who just don't like doing it. A lot of schools do it, and kids who don't like painting or sketching do it really badly and become objects of ridicule.

Programming et al. has a meta-science: it's called 'making', 'tinkering', or 'hacking', or whatever you call it. If there are people who don't have a taste for it, teaching them sorting algorithms is as boring as learning math. Common people are one thing, but even most people who join our industry as programmers discover they just want to do something else and go down the managerial stream.

Programming isn't for everyone, just like building houses, cars, dams, or an electric grid isn't for everyone. In all those disciplines, common patterns of problem solution and general science are at play. Chances are if you like chasing a difficult bug and fixing it, you will also like repairing a car engine, or fixing your TV.

I have an elder cousin who runs a shoe shop for a living; he tried very hard to learn programming during the dot-com boom era. He gave up in like six months. He realized that learning something only because everyone else is just doesn't work out - and the problem isn't just programming. If you don't like solving problems and are not good at problem identification and opportunity identification, it gets really, really boring at some point. He likes to sell things and is exceptionally good at that. Building things is just not for him. And there is nothing wrong with that.

The biggest problem with 'teach everyone to code' is you have to first ask, 'Do they want to learn coding?'.


The issue with every visual programming language I have used is that while it is easier for beginners to learn, typing is almost always faster once you start to understand programming.


I don't think programming should be mainstream; I think software should be simple and intuitive enough to let anyone get work done without having to know how to write code.


I agree with the author in regards to coding web sites. It's not enough to know just a few languages; there's a whole frickin' vocabulary of crap piled on layers of services and frameworks, frosted by the JS flavor of the week.

Or just configuration. Everybody wants to reinvent Make and they all do it wrong...


There is a reason we speak and write words to communicate complex ideas rather than hold up pictures. Words are powerful and flexible. They are not going away.


When I saw the title, I thought it was, "Come back after you practiced for 10 years" which doesn't help much.

I used to think the solution was better interfaces, or making programming more accessible to the lay person. I get less sure of this over time. It seems like what we've gained in language ease of use (Python versus C) we've given back in environmental complexity. The net is better productivity for programmers, and more people crossing the divide (CS programs growing by leaps and bounds) but we're far from being there for the Everyperson. I don't see this changing in 10 years without increasing the rigor and logic requirements of our entire educational system.


The author of the article wrote it using words. He didn't make a video or draw a picture, because words were the best way for him to communicate with other humans. For some reason, he wants a different way to communicate with machines, one that is inherently not the way humans communicate with each other.

I also agree with what others have said, that this is nothing but a bitch fest with nothing actionable other than we ought to be creating visual programming tools no one wants or needs instead of doing actual work. Sure, buddy. I'll get right on that.


Man, people are so overly sensitive... it's almost like someone is trying to promote job security.


I don't think the author understands programming if he sees object-oriented programming as being in the same bucket as HTML/CSS/JS. OOP is a way to deconstruct problems, just like functional programming, and their beginnings are based on very similar principles.

Any language that fully accepts the object-oriented paradigm lends itself very naturally to functional programming, and vice-versa.


Uh. What? Fully accepting the object-oriented paradigm does not make a language any more likely to properly isolate state and side effects[1], any more likely to have immutability as a default assumption, or any more likely in any other way to encourage referential transparency. As referential transparency is the essence of functional programming, this makes your claim very confusing.

[1] Following OOP design patterns makes a given program more likely to properly isolate state, but this is by no means a virtue of OOP languages.
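To make "referential transparency" concrete, here is a minimal, contrived Python sketch (not from the thread): a referentially transparent call can always be replaced by its value, while a call that touches hidden state cannot.

```python
# `double` is referentially transparent: any call can be replaced by its
# result without changing program behavior.
def double(x):
    return x * 2

# `impure_double` depends on and mutates hidden state, so the same input
# can yield different results - not referentially transparent.
calls = {"n": 0}

def impure_double(x):
    calls["n"] += 1  # hidden side effect
    return x * 2 + calls["n"]

assert double(3) == double(3)                # always 6
assert impure_double(3) != impure_double(3)  # 7, then 8
```

Nothing about an OOP language forbids writing `double`, but nothing encourages it either - which is the grandparent's objection.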


I think the author should read Fred Brooks's "No Silver Bullet". Many of his arguments are still valid after 28 years. "There is no single development, in either technology or in management technique, that by itself promises even one order-of-magnitude improvement in productivity, in reliability, in simplicity."


"No Silver Bullet" reminds me of the physicists at the turn of the century claiming that Newtonian physics held all of the answers. He blames the issues in software engineering on bad programmers, instead of questioning his field. His narrow-minded, defeatist arguments fail to recognize the full potential of computer science. I don't buy his argument that Java or JavaScript have eliminated most of the accidental complexity in programming. It's like Von Neumann claiming, "I don't see why anybody would need anything other than machine code."

Computer Science is still in its infancy. We haven't reached the full potential of Von Neumann architecture, let alone the dozens of non-Von Neumann systems that have been largely ignored by academia. Recent advances in neuroscience may open up a whole new model for information processing, such as IBM's SyNAPSE project.[1] Have you watched Bret Victor's, "The Future of Programming"?[2] He does a wonderful job of countering many of Fred Brooks's points.

[1] https://www.research.ibm.com/cognitive-computing/neurosynapt...

[2] http://worrydream.com/dbx/


Yes I have watched Bret Victor's talk. Narrow minded or not, Brooks's predictions held true for more than a decade after the time of writing. I'd like to read your counter-arguments.

I'd say architecture and advances in neuroscience are tangential here. I think the core of the problem is still somewhere else.

* sidenote: Lisp machines weren't completely _ignored_ by academia, at least.


I think the reason visual programming hasn't taken off is that there is a very real difference between what you can express with language and what you can express visually. Images don't really contain determinate, precise statements or propositions the same way that written or spoken word does. In contrast, language can express any thought, no matter how recursive or abstract, effortlessly. For the narrow domain of programming - not typesetting, not rendering and manipulating images - the text editor happens to be an extremely powerful tool that allows us to tell the computer exactly what we want it to do. It hasn't been replaced because it hasn't needed to be - it is simply the most powerful tool in which to write language.

The other reason visual programming hasn't taken off is that programmers aren't visually oriented, right-brained, intuitive people. Exceptions to this rule abound, but I am certainly not one of them, and neither are many programmers I've known. They've tended to be left-brained, linear, linguistic, rational-over-intuitive sorts of people - which is why we pick up computer languages so quickly, and it's also why we like typing code into a text editor instead of drawing our program on screen. I know that functionality isn't lateralized so easily and this is an oversimplification - but there does seem to be a dispositional difference between those who are attracted to programming and those who aren't.

But this'll probably change, as everything does. As the author notes, FP and lambda calculus have been around since the beginning of computer science as a field, and they are just now starting to get mainstream traction. So, maybe, in about 20 years visual programming tools will start to be mature enough to be applied to any arbitrary problem, and a powerful visual language will develop that attracts enough visually-oriented people to change or revolutionize the field. But we can't put the cart before the horse - the tooling needs to mature enough to cause this network effect to occur.
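As one small illustration of that mainstream traction (my example, using Python's standard library): lambda-calculus staples like anonymous functions, map/filter, and fold now ship in everyday languages.

```python
from functools import reduce

# First-class functions, map, filter, and a fold - FP staples that
# were long confined to academic languages, written in plain Python.
squares_of_evens = list(map(lambda x: x * x,
                            filter(lambda x: x % 2 == 0, range(10))))
total = reduce(lambda acc, x: acc + x, squares_of_evens, 0)

print(squares_of_evens)  # [0, 4, 16, 36, 64]
print(total)             # 120
```

Whether visual programming follows the same slow path from research to mainstream remains to be seen.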


I would say I am pretty visual. I like to draw ER diagrams of database designs so I can see what's going on. I usually have a visual representation of the tables in my head, with positions, when I am thinking of queries. The tables stay in the same positions (probably based on the ER diagram I drew ages ago).


We currently have a decentralized mess of competing products. Many of them overlap and do similar things, but they all feed off of each other to improve. It is this competitive environment that improves the products and libraries we use. Anybody can identify a flaw in an existing system and build something to address it.

What this article proposes is a monolithic culture, where everything is centralized and controlled by a few entities. We tried this in the IBM/Microsoft days and it doesn't work. We end up with a single poorly documented tool that offers limited functionality.

The world of programming is more accessible today than ever before. If you are a novice, there are tools for you. If you are an expert, there are tools for you. If you are transitioning from novice to expert, there are tools for you!


I left this comment at the blog:

Perhaps coding will look like Minecraft, where you dig physical paths for the water, for the chickens, and hunt down bugs with bows and arrows. Modern tablets with no keyboards do Minecraft just fine (my 8-year-old and all his friends are completely addicted to it). Also, they can join LAN games and work in the same virtual world. This may allow multiuser programming, no? Weird thoughts. They won't call it programming either, they'll call it something like flowcraft or some such.


I think this type of concept is what the author is driving at. We shouldn't be manipulating blocks of words; we should be manipulating what those blocks of words create, to make even bigger things.


Visual-based programming is definitely something that can help more people solve problems, but it gets complex to manage too (or so we've been told by blogs and others).

Perhaps something like visual-based programming would suit a different mindset altogether.

Truth be told though, programmers have managed to solve the website problem with all the new site-builders out there. They all seem to be "drag and drop", which should have been addressed long ago as a standard for at least the frontend.


The last (natively compiled) program I wrote to run on the Windows OS was done using an almost completely visual programming tool called ProGraph [1]. It worked well, and the small cards containing the code made sure you didn't put too much logic in one picture (much like the character limit Forth imposes on its page size). The system lives on as Marten [2], but only supports Mac OSX now.

While there are some things I like about visual programming, there are also many issues - perhaps the biggest is that it lulls the developer into believing they don't need to understand the underlying computer. I think a similar analogy is how a newbie RoR developer can create a CRUD application without ever looking at or understanding the scaffold-generated code.

I've thought about this a lot (in fact, years before I found ProGraph I tried to develop a system I called FloPro which was standard Fortran flow-charts on the front end and generated C (to compile on your local platform) on the backend), but I haven't seen a solution to these problems in any visual programming system.

[1] http://en.wikipedia.org/wiki/Prograph [2] http://www.andescotia.com/products/marten/


That's quite an interesting product you have there.

Were you able to build it only by writing code?


At the risk of some criticism, I would like to make some visual language comments. I am the author of a visual language programming environment called Marten which supports the Prograph visual language. I use this IDE every day to write commercial-grade software for clients. Marten is in fact written in Prograph using Marten. I have created software in many other languages such as FORTRAN, RATFOR, C, C++, C#, Objective-C, Java, Perl, Python, along with i386 and PPC assembly, so I am very familiar with the difference between programming in a text-based language and visual programming. I found that visual programming is so superior to using text-based languages that I wrote my own IDE to ensure that I could continue to do so. Consequently, Marten stands as an example that not only is visual programming not "vaporware", it can be a valuable and powerful addition to a developer's toolkit.


This blog post would have been way better without the ranting style of writing.

It doesn't even get to the point (that programming does not have to be text-based) until halfway into the article.

The title and the objection to other people learning to code are at most tangentially related to that point. Just because there could be better ways to give instructions to machine doesn't mean other people should wait to learn how to instruct machines until those methods are developed.


For anyone interested in the prospect of visual programming for web development, please join our newsletter at crudzilla.com ... we are coming out with an update that will attempt to fit a wordpress-like plugin system into an IDE. We think that'll be a good start for a more visually oriented development environment.


You can only learn how to program if you have learned how to code. So learn how to code :)


Learn to code, but don't stop there. Aspire to write good quality stuff.


Focusing on solving a problem instead of the tool is what motivates me to try new frameworks and actually code.

I agree that the focus has unfortunately shifted way too much toward specific technologies.


Again with this visual programming bullshit


Eye opening at certain points, dragged on at others. Either way I'm more excited to code. I mean, program. :)


I'm neither a coder nor a programmer, I'm a code-monkey and proud of it.


Don't learn how to paint, learn how to create art. How is one without the other?


"The current state of painting sucks. You have to figure out which colour mixes with which colour to become what. You have to learn how to hold the brush and what different brushes do. you have to learn different strokes. It shouldn't be like that at all, you shouldn't have to LEARN all those arcane things at all. People have been doing it for CENTURIES and they are unwilling to change due to some perverse pride in all of this. What should be, is that you think of a pretty picture and tada! It appears on Canvas. Without you going through the process of actually making it"

That is the author's viewpoint.


Nobody seems to be giving the author the benefit of the doubt - so I'll play devil's advocate against your analogy.

"The current state of painting as a way to capture images sucks. It takes years of practice, and even then the likeness created is only ok unless you're a true master. There's this area of research around lenses and films that professional painters seem to scoff at - but I really think we could be doing profoundly better if we can get it working."


Hahah, funny thing is I had these thoughts as I typed my last reply.

Btw it's rather interesting how the same cycle begins in photography where old school photographers scoff at post-capture editing via photoshop and say that it takes away from the beauty of the art. So maybe even if we get visual programming, maybe things won't change...

But still, I gotta ask, how can you learn programming without learning HOW to program? The HOW is where code comes in. Yes, we'd all love the magic of making the computer do as you wish with a point of a finger. But until then, why not enjoy the process of taking a problem, thinking of a solution and figuring out how to explain that to the computer?


Is it just me, or is this guy completely wrong? In my opinion, languages aren't that confusing or unwieldy; the hard part about programming is the logic, not the languages.


Is it satire?


It has that feel, doesn't it?


I think there is some obvious satire here. And, I think not everyone has picked up on it.


It seems that folks here are reacting to the angst of the author as well as the literal meaning of the language used to make his point instead of contemplating what the author is trying to actually say.

Otherwise, why get hung up on definitions of "coding" vs "programming" or concepts such as "text" vs "visual"?

I think this post is an expression of the frustration that we have all reached at some point (if not many points) along the way while using any language or solution or architecture: this sucks and there's got to be a better way to get shit done!

How many times have you been working on project "foo", run into a problem and employed solution "bar" (or better yet, created solution "baz" on your own) in order to solve your problem... only to find yourself derailed into a swamp of limitations with "bar" (that everyone forgot to mention in that awesome blog post you found)? Or, how many times have you lied to yourself and said "I'll improve upon the limitations of 'baz' in the v2 release" of your home-grown masterpiece, only to never return because your actual goal is to finish the "foo" project?

The problem might be the tools or the user or the application or even the philosophy...or it might be a combination of that list plus some I haven't mentioned. Perhaps these problems have already been solved and the lessons have been lost? Or, perhaps more analysis is needed to come up with a new way around this issue entirely? I certainly don't have the answer right now but this article made me take a step backward to survey the landscape (or the wake?) of the most recent 10 years of "disruptive" (read, self-serving) technology and I feel an undeniable sense of dissatisfaction with it all.

Maybe that desire for something better just means I'm a programmer and not a coder? I don't know but I won't avert my eyes from the problem no matter how obvious it is to everyone simply criticizing the article. Awareness is the first step to understanding the problem you are trying to solve and I see people recoiling from that. Denial is a sign that a belief (or end of thought) is overruling the critical thinking process and to me that's acceptance of the status quo (not just dangerous but stupid too).

I did get a good chuckle at the metaphor provided by xerophtye's comment: "you think of a pretty picture and tada! It appears on canvas!" The truth is always funny, so perhaps this comment encompasses the entire article, but I think it oversimplifies what the article attempts to drive toward. Sure, comprehension of the building blocks (or, the colors) allows you to build bigger and better buildings (or paintings), but, if that were truly the case, why aren't things improving? Wouldn't all of the so-called experts chiming in have already built bigger and better buildings? And if so, what are they? NoSQL? Twitter Bootstrap? Ember or AngularJS? Bitcoins? Animated CSS? All I see are copies of the original idea done "my way" which is "better".

Amateurs borrow, professionals steal. And that is the process improvement methodology I see employed today.

If everyone thinks they know their craft so well, inside and out (and, for safety, let's say I don't), how might you take that knowledge (and a step back) and change/improve upon how things are accomplished today?

I think that's the core of this article.

To put it another way, if the acquired knowledge to accomplish tasks via "programming" is only useful and self-serving within the realm of the library/language being used, how is that moving the entire ball forward?

I think the solution implied here is that a re-examination of history and more experimentation with different architectures or applications of the existing building blocks might get us over this hump we're facing.


Why 10 years? Given that we've known what a mess the software development world is for 50 years and nothing has improved, what makes you think anything will be better 10 years from now?


Might be a reference to the famous Norvig essay: "Teach Yourself Programming in Ten Years"?


No, I'm not that clever. It seemed like a reasonable time to check back in and see if the revolution happened.


Anytime anyone tries to spin any of the following - coding, programming, software engineering, software development - as distinct things, and expects me to acknowledge their special definitions that they probably made up, I kind of lose interest.



