“Computers are ubiquitous in modern life. They offer us portals to information and entertainment, and they handle the complex tasks needed to keep many facets of modern society running smoothly. Chances are, there is not a single person in Ars’ readership whose day-to-day existence doesn’t rely on computers in one manner or another. Despite this, very few people know how computers actually do the things that they do. How does one go from what is really nothing more than a collection – a very large collection, mind you – of switches to the things we see powering the modern world?”
Unlike a lot of programmers who prefer programming to remain a secret magical art, I think it will be a fact of life in the next 50 years that programming will just be something people do at a basic level as part of everyday life. My guess is that we’ll end up with something like the world of Neal Stephenson’s “Diamond Age”, in which the average first-world citizen has technology that can build physical stuff. We have 3D printers becoming a lot more affordable, for example.
I don’t think teaching a programming language should be the centre of “writing for computers”. People would do better to learn programming through understanding algorithms and structural design. Most programming languages today, just like “flat design”, are mostly superfluous; any special “features” are just fads that gain prominence by being different from the past, not by introducing new ways to think about design.
This seems naive. The average person is far less intelligent than most intellectuals actually realise.
It is not purely due to poor teaching that first year university programming courses have immense failure rates.
It does seem to be beyond most people.
Thereby completely missing the point of my comment.
Thinking algorithmically and structurally should come before learning programming languages. A lot of first year courses assume people already know how to think properly and that programming is a matter of writing code.
What I suggest is that it shouldn’t be taught at university first but in high schools and possibly earlier.
Using university failure rates as proof is lazy, and is probably indicative of a mind not suited for good programming either. Programming requires foresight, hindsight and lateral thinking.
I had the same experience in high school comp science as well; students would completely fail to see obvious connections, just as they do in mathematics.
With still-higher-level languages this problem will be lessened, but there does seem to be a lack of desire, or perhaps an inability, to think logically in what may be the majority of people.
If they can’t/won’t learn algebra, how are they to learn coding?
Organizations like Khan Academy show that children of all stripes are willing and able to learn algebra given the right teaching environment.
I don’t think logical thinking is as fundamental to programming as algorithmic thinking is. I’ve known many intelligent people, much more intelligent than me, but they can’t program for shit. Logic is a red herring in programming and is really only a problem “in the small”. Programming happens “in the large”.
Then, if this particular art is lost on the intelligent, and requires more algorithmic thinking, how is that to come about?
More importantly, how do you make people actually *want* to program?
Why is it that most are happy to use computers for leisure, but are repulsed by the notion of understanding them more deeply?
Just like the other sciences, there seems to be a significant desire to avoid anything to do with actual analytical thinking.
Well, thinking is an expensive activity, in terms of energy requirements.
Otherwise we would all be fluent lispers…
Kochise
Bingo! A point utterly missed by the article author and subsequent Ars commenters, who are all too busy debating whether new programmers should be taught Python/Java/C/C++ first.
The huge irony is that Seymour Papert was successfully doing exactly this with 6 and 7 year-olds several decades before the average Arsian was even assembled. LOGO was never about teaching kids to program: it was about teaching them to think, and how to learn, and how to learn how to learn. Gaining the ability to assemble useful programs along the way was merely a side-benefit.
The wonderful thing about LOGO was that it avoided all of the mindless bureaucracy and special-case behaviors so rampant in ‘popular’ languages. Learners weren’t misled into believing type declarations, memory management, conditional and loop statements, and other such tedious details were what programming was fundamentally about. Directing attention to those is like teaching a student every single irregular verb in the English language before explaining what a verb actually is, or demonstrating how the vast majority of logical (regular) verbs operate.
Being essentially a better-looking Lisp, LOGO was incredibly parsimonious: the only core features were words and values, and everything else was expressed in terms of those structures. Thus abstraction, which is the real key to becoming a programmer, is naturally the second or third thing taught: it’s simply a matter of defining new words in addition to the words already provided by the language. Neither was making mistakes seen as something to be ashamed of: instead, it was part of the natural learning process: write some words, run them, figure out what’s not working and fix it (i.e. debugging), and learn from the whole experience.
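For anyone who never met LOGO, the flavour survives almost intact in Python’s standard turtle module (itself a direct descendant of LOGO’s turtle graphics). A rough sketch of the “defining new words” idea; the function names here are my own invention, not LOGO vocabulary:

```python
# Python's standard turtle module, a descendant of LOGO's turtle graphics.
import turtle

def square(size):
    # A new "word", defined in terms of words the language already provides.
    for _ in range(4):
        turtle.forward(size)
        turtle.right(90)

def flower(petals, size):
    # Another new "word", defined in terms of the one above: abstraction.
    for _ in range(petals):
        square(size)
        turtle.right(360 / petals)

flower(12, 80)  # write it, run it, see what's wrong, fix it, learn
turtle.done()
```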
Papert ultimately failed, of course, but not due to flaws in his core tools or techniques. Rather, his objectives were undermined and ultimately buried by the heinous politics of education: technophobic teachers fearful on one side; programming priesthood threatened on the other. Programming became an elitist course for special students only; computer education in general degenerated into poor-quality ICT training churning out third-rate Office monkeys.
Remarkably rare qualities in the modern profession, alas. Probably not aided by the silent degeneration of Computer Science into Software Engineering and from there to bottom-of-the-barrel Java diploma mills, but that’s another rant…
As someone who learned Logo at the age of 8, I can attest to this.
In France, first-year university classes are *big*, whereas they were 30-50 people in high school; students aren’t supervised like they were before, they live alone for the first time, etc. In these conditions the high failure rate has nothing to do with intelligence and more to do with a lack of self-discipline and maturity.
But the same deficits that make them fail that first year of college/university become the same problems once they try to write a complex program, no matter what the language/environment.
Code that does real work tends to be complex, and even simpler programs still need the programmer to consider how to handle things when something goes wrong with inputs, hardware, or communications.
A guy at my school began his Australian undergraduate medical degree at 15. Despite being in the 99.9th percentile he failed every subject in first year due to immaturity. Luckily he was allowed to re-enroll after two years.
With smartphones, tablets and app stores that let you click and install software to your heart’s (and wallet’s) content, I seriously doubt it.
If anything, it’s far more likely that programming will increasingly become the domain of professionals and serious hobbyists. Think about it: in the late 70s and early 80s you switched on your computer and were greeted by the BASIC prompt that encouraged you to explore and try things out. Before that, everything was even harder because you had to buy a kit and actually assemble it yourself — and decode the LEDs that served as the display and go to the User Group meeting and trade software (which you had to write yourself) with fellow enthusiasts.
Of course programming is a lot easier now, with plenty of languages to choose from, IDEs, online tutorials and whatnot. However, smartphones, tablets and walled gardens are creating a generation of pampered users. Not to be insulting, but when everything you need is usually one or two clicks/taps away, I can’t see a lot of people getting interested in programming just out of sheer curiosity.
RT.
I sometimes like to think that the “art” of programming resembles some kind of “magic”, and that we, the programmers, are the ones who (try to) master the ability to think in programming terms and “bend reality” to what we desire.
I think the point is being missed: whether it’s a good idea or not, it’s probably going to happen.
Congratulations to all those telling it how it is, but that is near to useless in predicting what will be. Even if most people are psychologically not disposed to maths and the sciences, the fact is that most people in the first world today know more about basic maths, the sciences, and even literacy than people did just over a hundred years ago. Average intelligence increases.
No, not everyone is going to be a programming genius. That’s not my argument at all. My argument is that, for good or bad, basic understanding of programming will be expected. Just like basic maths and basic literacy.
One reason I mentioned previously – the potential availability of private manufacturing like in “Diamond Age”.
Another reason is the trend towards automation in all physical jobs and the outsourcing of all other menial low skill jobs. Pretty soon, “entry level” jobs will be about being able to program basic automation of tasks and maintenance.
I think it’s going to happen, and either society keeps up by updating the education system, or it risks widespread unemployment and unrest.
kwan_e,
Our culture is openly embracing technology as a means of life. I’ll side with you: for me there’s no doubt that kids are smart enough to learn how to program it (given the proper educational foundations, which are by no means a given).
However there are some roadblocks too. Children are being introduced to technology as fashionable bling instead of programmable tools. Worse still, today’s popular consumer devices are becoming *less* programmable than their predecessors, which are threatening to displace open computing technologies at home.
Looking past these roadblocks, I have to wonder if there’s any need for a significant percentage of the population to know programming. What would that get us? If half the population could program, wouldn’t most of them be overqualified for the menial jobs they end up getting? Many of us are already overqualified today, meaning our advanced degrees are not being put to great use.
I have addressed that problem specifically. Menial jobs are getting automated – slowly for now, but it’s happening and can only accelerate.
Warehousing is becoming automated obviously (eg Amazon’s robotic warehouse with robots zipping around at 40km/h). Industrial manufacturing is getting better automated. Shopping centres are moving towards self service, and more and more people are just ordering things over the internet (the dotcom dream wasn’t dead, just resting). Google is developing driverless automobiles. Roomba. The list goes on.
Today’s devices aren’t very programmable, but as we can see, things like the iPad and Android are able to make the possibility of programming available to a wider group of people, but that’s beside the point. Programming will become a menial job.
I’m not saying the average person will write in Java or C++ or C# or one of the functional languages. There will probably be domain-specific languages, less powerful but easy enough for programming to be common knowledge, like maths is today.
kwan_e,
“I have addressed that problem specifically. Menial jobs are getting automated – slowly for now, but it’s happening and can only accelerate.”
Yes and no. The price of robotics obviously has to continue to drop for them to become more prevalent. In theory we might get rid of most jobs and have robots to do all the work. Some might even consider it a utopia. However if we don’t reform our current economic models, it might easily result in mass joblessness. The thing with robots is that production can scale WITHOUT creating enough new jobs to replace those that had been laid off.
For example, a highly successful robotics company might eventually employ 100K engineers to build machines which will do the menial work of 50M people.
There’s certainly no need for 50M engineers, and even if we pretend there is, there would not be enough money to pay all of them good wages.
“Today’s devices aren’t very programmable, but as we can see, things like the iPad and Android are able to make the possibility of programming available to a wider group of people, but that’s beside the point. Programming will become a menial job.”
Can the ipad be programmed without a computer?
Can an android?
You think employers care, or the government cares? They’re going to push for this no matter how many people lose those jobs. They’ll just redefine unemployment yet again.
Yes you are right, it’s going to require reforming current economic models, especially employment models. But employers don’t care. They always want to get rid of the human element for cheaper, non-unionized labour if they can. They haven’t cared in the past when the higher-ups made a bad decision and covered it up by laying off tens of thousands of low-level workers.
I think one of the solutions has to be a rotational workforce. We have to be done with the idea that everyone has to have a job every day of the year and that welfare is bad. You can’t force people to find jobs that don’t exist, and you can’t force employers to create jobs when they don’t need them or can’t afford them.
This leaves us in a situation where the only jobs left are the highly skilled jobs that are too difficult to automate.
I personally don’t have a problem with welfare, but a lot of people do, so why not cut people’s working year short and have workers do essentially shifts a few months at a time. They’ll still be “earning their keep”. Robots aren’t going to complain about how they have to work and how others are on welfare, are they?
What does that matter? I’m talking about potential 50 years in the future. It’s obviously part of a trend.
kwan_e,
“You think employers care, or the government cares? They’re going to push for this no matter how many people lose those jobs. They’ll just redefine unemployment yet again.”
If we’re really conceiving of doing away with an employment-based society through obsolescence, then we as a society really should strongly reconsider the very existence of for-profit corporations as well. Because if we really do end up with machines taking the majority of jobs (a bit of a stretch, but I’m willing to roll with it), the means of production will no longer be dependent upon ordinary people as employees, and there’ll be no corporate ladders to climb either. You’ll either be an owner or you’re not, and there will be very few opportunities to transition from one to the other because most people will have nowhere to work. Since work would mostly not exist, working would become something people do for their own pride & entertainment.
Under such circumstances, society would probably be better off transitioning to public ownership where the technology exists to serve the general public rather than private profit based interests, which would likely have collapsed into a handful of all powerful oligopolies.
“I think one of the solutions has to be a rotational workforce. We have to be done with the idea that everyone has to have a job every day of the year and that welfare is bad. You can’t force people to find jobs that don’t exist, and you can’t force employers to create jobs when they don’t need them or can’t afford them.”
That’s a logical solution to unemployment, especially considering how employees are working longer hours each year. Within the past decade, US law was changed to specifically exclude IT workers from federal overtime pay requirements so that businesses are legally entitled to demand longer hours from us with zero additional pay (forget time and a half). So we’re kind of moving in the opposite direction.
“What does that matter? I’m talking about potential 50 years in the future. It’s obviously part of a trend.”
I’m a bit confused… it matters because you brought them up as examples of that trend “…the iPad and Android are able to make the possibility of programming available to a wider group of people…” I find them ironic choices for illustrating the point because technology could be less user accessible in the future.
Incidentally, the FSF just sent an email about its campaign to fight restricted boot devices, if anybody’s interested:
http://www.fsf.org/campaigns/secure-boot-vs-restricted-boot/2012-ap…
I think the problem you highlight is actually exacerbated by certain IT jobs being considered as above “entry level”, if not “elite”. IT administration is kind of like the janitorial equivalent in the eyes of the corporate types, but it requires a great amount of training and time. The sooner those IT jobs no longer require university degrees, the better.
With the momentum, IT jobs can become unionized again. Employers will just have to suck it.
The devices themselves may be less user accessible, but the trend I’m talking about is programming itself being available to people without going to university. As I understand it, the iPad and Android created a market for programmers that didn’t require university degrees and established companies.
Yes, most apps are of poor quality, but it doesn’t matter. The opportunity and market is now there, and no matter how many restrictions are put in place, you can’t deny that programming itself is being opened up and your average student will start seeing programming as a required basic skill.
Uh oh, cue the “RMS is a fanatic” slogans.
kwan_e,
“IT administration is kind of like the janitorial equivalent in the eyes of the corporate types, but it requires a great amount of training and time. The sooner those IT jobs no longer require university degrees, the better.”
I’d say that’s already the case. When institutions are pumping out so many professional degrees per year, they become requirements for jobs which previously did not require them. Back in the 90’s, employers would hire anyone who was able to do IT administration regardless of degrees, since most candidates didn’t have one. I believe the higher degree requirements today are a result of supply and demand rather than the increasing difficulty of the work. If the supply were to increase substantially as you predict, then won’t most employers just add more requirements to filter people out?
“The devices themselves may be less user accessible, but the trend I’m talking about is programming itself being available to people without going to university.”
Ok I see, they created new markets, and hence new openings for programmers.
“Yes, most apps are of poor quality, but it doesn’t matter. The opportunity and market is now there, and no matter how many restrictions are put in place, you can’t deny that programming itself is being opened up.”
I dunno, it’s still an incredibly ironic example to me; I’d have picked the Raspberry Pi or its ilk, since it doesn’t run a walled garden.
The old way of doing things is going to die out as young people go off and start their own companies, which is also happening. There have been a few voices of late saying that university degrees are useless, so we can see the potential zeitgeist of the next 20 years.
It’s completely ridiculous to try to make predictions based on short-term trends, as TM99 does. Short-term predictions are harder to make than long-term ones, just as weather is harder to predict than climate.
I think they’re apt examples precisely because they’re counter-intuitive. That’s basically been the history of computing since it started. Every naysayer has basically been wrong about where the next development comes from.
I picked them as examples because they have staying power, and that to me seems to be the most important factor. Culture doesn’t work on logic or rationality, but on durable popularity. Unfortunate but unavoidable.
I have hope for the Raspberry Pi, but it’s just not yet in a position for anyone to make predictions on how it will fare.
kwan_e,
“The old way of doing things is going to die out as young people go off and start their own companies, which is also happening. There have been a few voices of late saying that university degrees are useless, so we can see the potential zeitgeist of the next 20 years.”
Is this really a fundamental change, or just a market cycle? In the growth phase new markets can grow insanely quickly. There’s plenty of room for many players to grow, but eventually we reach a market plateau and the only way to grow further is for corporations to merge and/or others to drop out of business. The tablet market is still somewhere in the growth phase now, but once it becomes mature I think it’s going to look more like business as usual. Who knows, predictions about the future are all just educated guesses. There’s no right or wrong answer in the present.
“I think they’re apt examples precisely because they’re counter-intuitive.”
I still don’t get how you conclude that these new tablets are opening up programming? Say I’m a kid who got an ipad for christmas. Wouldn’t I still need to own a computer to jailbreak and/or program it? In other words, doesn’t the set of users who can potentially program an ipad mostly overlap with those who can already program a computer? I don’t see how the success of ipads is evidence of a trend that the set of programmers is increasing.
This isn’t to say things won’t change in the future, I strongly hope so because otherwise today’s trends are pointing towards the rise of closed computing devices.
Assuming I’m wrong and computers are becoming more open for programmers, I’m still kind of skeptical that substantially more people will be actual programmers in the future than are actual car mechanics today. It’s not just matter of intelligence or abilities, it’s also about practicality. Like you, I think many more people could become programmers, just like they could become mechanics. But if everyone were a programmer, it would be very difficult for anyone to sell themselves as a programmer. It’s about supply and demand.
Anyways that’s just my take. I’d love to see many smaller players grow in the computing world since I am among them. We’ll see how it all pans out.
That’s why I refused to predict anything on a time scale of 5 years or even 10 years.
Again, I can only really go by the example of Admiral Grace Hopper. From her time to now, the trend has only gone in one direction. It didn’t take just 5 or 10 years, and at various times it looked like a pipe dream too.
Do you need to jailbreak a smartphone in order to get programs into their App Stores? I don’t understand why you are fixated on jailbreaking when it’s hardly the main way for people to write programs for those devices.
Most people won’t be programmers in much the same way most people aren’t mathematicians or authors, yet we can all read and write and do mental maths. That’s why my prediction is conservative and only goes as far as basic programming knowledge.
I would really love to see numbers on how many people today have an understanding of simple programming constructs like if-then-else, switch-case and while-do or do-until. And I’d really like to see numbers on how easy it is to teach people. That’s so basic as to be BASIC. I have an inkling that your average primary school kid can pick up BASIC much faster than kids a few decades ago.
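For concreteness, the level of “basic” I have in mind is roughly the following few lines of Python (standing in for BASIC; the guessing game itself is just a made-up example):

```python
# The level of "basic programming knowledge" I mean, sketched in Python
# standing in for BASIC. The game itself is invented for illustration.
secret = 7
guesses = 0

while True:                     # while-do: repeat until we break out
    guess = int(input("Guess a number from 1 to 10: "))
    guesses += 1
    if guess == secret:         # if-then-else: branch on a condition
        print("Got it in", guesses, "tries")
        break                   # the do-until exit condition
    elif guess < secret:        # chained elif, Python's stand-in for switch-case
        print("Too low")
    else:
        print("Too high")
```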
The difference between programmers and mechanics is that it’s remarkably easier to learn programming.
I mentioned the emergence of 3D printers in one of my earlier comments. It’s still early days, but such developments, leading to “personal manufacturing”, can also spur new waves of hobbyist programming. 3D printing could potentially be a much bigger draw because the results are tangible.
Unlike “programming VCRs”, programming 3D printers could be made slightly more powerful by allowing, say, a BASIC-like language. Certainly, 3D modelling tools already have easy scripting languages, and the people who script them often do so for artistic purposes.
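As a toy sketch of what hobbyist-level printer scripting might look like (G1 is a real G-code linear-move command; the function, the sizes and the feed rate are all made up for illustration):

```python
# Toy sketch: emit G-code (the common language of 3D printers) for a
# square perimeter, repeated layer by layer. G1 is a real linear-move
# command; everything else here is invented for illustration.
def square_layer(x0, y0, size, z, feed=1500):
    corners = [(x0, y0), (x0 + size, y0),
               (x0 + size, y0 + size), (x0, y0 + size), (x0, y0)]
    print(f"G1 Z{z:.2f} F{feed}")                # move up to layer height
    for x, y in corners:
        print(f"G1 X{x:.2f} Y{y:.2f} F{feed}")   # trace the perimeter

# "Modify the downloaded plan": widen the box by 5 mm, ten layers tall.
for layer in range(10):
    square_layer(10, 10, 20 + 5, z=0.2 * (layer + 1))
```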
3D printers are just one area. There’s just so many different and unexpected areas that probably don’t even exist yet that could contribute to the trend.
kwan_e,
“Do you need to jailbreak a smartphone in order to get programs into their App Stores? I don’t understand why you are fixated on jailbreaking when it’s hardly the main way for people to write programs for those devices.”
If you want to open up programming to amateurs, then yea jailbreaking is necessary on apple’s devices and certain androids which don’t permit sideloading due to manufacturer/carrier prohibitions, and of course windows rt. I may be fixated on it, but it’s still extremely relevant to a discussion of open programming. If we cannot program software for our own devices and our friends, then it’s not an open computing platform. Maybe in the end you don’t care much about that, but there’s no denying it raises some barriers to programming.
I’d love to see apple open up the iphone/ipad/etc and allow users to program on them like the programmable calculators we used to use in high school. Remember those? We used to write & transfer programs directly on our calculators without the need to program on a separate computer and without developer keys or app stores. I’m sure that’s too much to ask of apple, but that would certainly help open up programming.
“3D printers are just one area.”
I like the example, but these things will also be programmed by a small fraction of the population and then multiplied in the millions. Again, I don’t think it’s because millions of people could never be smart enough to do programming, but rather that there’s simply no need for millions of individual programmers when virtually everyone’s work would be overlapping with everyone else. It might be like art, lots of people would enjoy doing it, but most could not make a career out of it.
There are thousands of examples we could use, but I want to make sure my point is getting across, which is that a relatively small set of programmers are going to be working on the programming/templates/manufacturing which gets copied (commercially or freely) to everyone else. The ability to copy software is so efficient that it can scale very easily without needing to hire programmers proportionally to market size. This small set of programmers might do well with royalties, but there will absolutely be fewer programming jobs than the jobs which were replaced by automation. Can I get clarification that you agree this would be the case?
But they can still program apps for the app store… Surely programming an app is easier than programming an open device? With an app, there’s loads of help online, and you don’t have to know about internals.
And with Android, there’s an emulator.
The programmable calculators I had allowed programs to be written on the computer. Programming was a typing nightmare on them. I can’t see tablets being good for programming.
As you said previously, there’s always the Raspberry Pi and Arduino etc. The point is that there is a greater lure in the iPad/Android scene purely by force of numbers. Then there’s always XNA for the Xbox.
In that sense, they don’t really have to be open to lure people’s interest – they just have to promise riches to app programmers, and they don’t even have to deliver on the promise.
Programming is a highly transferable skill, so people looking to get into programming through apps who find they don’t like the walled-garden approach won’t find themselves unemployable.
They wouldn’t need to. I’m not arguing for careers only. I’m talking about a potential cultural change.
I mentioned “personal” manufacturing: things that a person may want to make for their personal use that have no use elsewhere. For example, a DIY project. They may download plans off the internet, but to make them work for their DIY project, they may need to know some programming to modify those plans.
I agree that’s the case, but I don’t agree that employment is the only area where programming could become important enough to be a basic skill. I haven’t predicted precisely what areas, because some of them are probably still yet to have emerged.
And we still haven’t talked about what form programming could actually take in the future and that could provide even more unseen potential.
kwan_e,
“But they can still program apps for the app store… Surely programming an app is easier than programming an open device? With an app, there’s loads of help online, and you don’t have to know about internals.”
Well, no, haha. When I learned years ago there were plenty of books available; today there are so many online resources that one doesn’t even need the books. There’s no lack of high-level frameworks today. Programmers generally prefer open devices when given a choice about it.
“And with Android, there’s an emulator.”
Dalvik is Java with Google’s own bytecode target, which is why they need an emulator in the first place. Anyone who picks up Android programming could program in Java too.
“In that sense, they don’t really have to be open to lure people’s interest – they just have to promise riches to app programmers, and they don’t even have to deliver on the promise.”
Well that’s essentially it. Even if today’s tablets aren’t technically easier to program than their desktop counterparts, they might still be providing more motivation. Interestingly, I think tablet programmers on the whole make significantly less money than their desktop counterparts working for corporate wages, but I don’t have data to back that up.
“They wouldn’t need to. I’m not arguing for careers only. I’m talking about a potential cultural change…”
When you put it that way I can agree new user accessible technologies will bring huge benefits for DIY builders and opens up a whole lot of exciting possibilities.
I’ve contemplated building my own reprap machine just to play with it.
Your idealism just doesn’t match with the reality of the last 40 years of computing.
Yes, in my generation, everyone who used a computer had to learn even the basics of programming or you simply couldn’t use the device. This might have carried on into the early 1990’s. But then it started changing.
Kids don’t need to know nor will they ever need to know computer programming. They will learn, as they do now, how to use their devices like a car or an appliance. They will learn how to download songs via iTunes 16. They will learn how to do a Power Point presentation in Office 27.
Computing is moving towards greater and greater levels of lock-down and vertical walled gardens where two major companies, Apple & Microsoft, will control the hardware & the content, oh I mean software. Linux, even though I use it and love it, is an after-thought for most people. Android is terrific and can offer a higher level of customization, however, few ever root their devices other than to simply load a game that won’t otherwise run on their older model.
As to your points, the level of basic maths is atrocious compared to previous generations at least in America. In part, this is because of technology. Slide rules gave way to calculators which have given way to computers. Cashiers rarely calculate change in their heads when their POS tells them exactly how much to give back.
The same holds true for basic literacy. As the son of two university professors of English, I definitely am aware of the changes here. No one writes letters anymore and rarely do they even do a full email. It is about texting, texting, and more texting. Have you seen the new Shakespeare transliteration done in ‘text speak’? Wow, is all I can say, just fucking wow!
Higher levels of automation lead to lower levels of intelligent & creative use of the technology. The same is true for highly specialized technology. When radios ran on tubes, more ‘users’ could and did fix and augment their devices. As radios became increasingly specialized with ICs and transistors, fewer ‘users’ could or would even attempt to fix their devices. A great example of this is looking at weaving and textile mills at the birth of the industrial age. Previously, weavers had higher levels of training and education, including extensive internships or apprenticeships. They developed high levels of skill and creativity. Then things became automated. Large mills replaced the small tailors and weavers. Trying to say that the men, women, and children who ran those machines were as intelligent and as equally skilled at weaving, sewing, or creating textiles is ridiculous. They simply were not. The same is holding true wherever computers and computing automation have taken over.
Public education, at least in America, is trending downwards in its level of intelligence and academic challenge not only in the sciences but also the arts. I was an academically gifted high school student. I had available math classes from algebra 1 through algebra 3 & trigonometry through pre-calculus and calculus 1. By the time I reached college, I was ready for higher level math classes even if they were not a part of my major. I get college interns and graduate students at my workplace today many of whom never had beyond algebra 2 in high school. Frankly, it shows in the lower level of technical skills and critical thinking compared with those of us as their supervisors from a previous generation.
Can society change this downwards trend? Will they? If history teaches us anything, then the answer is usually no.
I point you back to my opening line; your opening shows you’ve completely missed the point, just as my opening line continues to predict.
“IT’S GOING TO HAPPEN”, is not idealism. At best, it’s a prediction, one way or another. A prediction is not an ideal.
We can all be old men decrying falling standards and how the past was better and everything is worse. Strange how the best times coincide with our developmental years, or a short time after, and everything since is the work of the devil…
“Our earth is degenerate in these latter days. There are signs that the world is speedily coming to an end. Bribery and corruption are common.”
Horseshit.
All you can provide is a tired trite response about ‘being old men’.
It just happens that my developing years, as you put it, coincided with the development of computers, which did require much more programming knowledge in general from the user, whether we went on to become bankers, professors, or computer scientists. That is not the case today for those in their developmental years. The only ones getting, or even requiring, that kind of knowledge are those who now intend to be in the field.
I didn’t miss your point.
Your point was a flawed prediction and was very much ‘idealistic’. It involves ideas that just don’t jibe with the reality of the fields you were making the predictions about.
I stated that your prediction was wrong. I then produced arguments to back it up. Address those or bow out of the discussion.
No you didn’t. You said exactly what other commenters have already said, which I have already addressed, which continued to be ignored.
You have made the exact same flawed point that another commenter already has, over and over again. Don’t be so up yourself as to think you had an original point that I hadn’t already addressed.
Address my points or piss off.
You are attempting to mix economic arguments with political idealism, education reform, and predictions about events 50 years into the future concerning programming.
You want to predict the future? Look at the past in that particular field or arena and conservatively estimate probabilities no more than five years out.
Otherwise you are just in a fucking fantasy world.
Obviously you were not trained well in critical thinking or in argumentation as you have not addressed any of the replies that address various flaws in your ideas, your logic, or your arguments.
I did address your points so this discussion with you is at a close. Enjoy your day.
Grow up. This argument is not that important.
You complain about me not addressing your points, but neither have you addressed mine. You and the other one just throw around words like “naive” or “idealism” or “fantasy” as if they were arguments.
CASE IN POINT:
Admiral Grace Hopper invented the concept of human readable programming languages when the top scientists at the time were convinced it was impossible.
You and the other one are making the same kind of arguments those old fogeys have. You cannot deny this. Human readable languages were “naive”, “idealistic” or “fantasy”.
History is on my side.
The article is about programming for beginners. You extrapolate that out and ‘predict’ that within 50 years we will all be programmers even when we aren’t computer scientists or IT professionals.
I and others point out that this is a bad prediction. Here’s why. You counter with political idealism. You counter with economic pipe-dreams like some Star Trekian utopia where we no longer work jobs or spend money.
Now you bring up a complete non sequitur about some admiral inventing a readable programming language. What does that have to do with the price of tea in China? Nothing. It is only relevant to computer programmers, not to the average worker in other fields, whether professional white-collar types or blue-collar drones.
History is not on your side in this argument about computer programming and the masses. The men and women who invented the foundational languages for programming today are dying off. My generation which grew up using these languages in order to use our computers are beginning to exit the workforce. The younger generations are not learning more and more programming languages. They are learning fewer. Only nerds, geeks, and hobbyists are playing with these languages. They aren’t being used in fields other than IT and computer science or very rarely.
I am not an IT professional. I do, however, know some programming languages and use them daily in my work and teaching. I have graduate students today whom I inform that if they are serious about doing psychological research, it behooves them to learn the R programming language. Is there excitement? Have any of them even learned a whit about computer programming like I did? No. The usual response is “Is there an app for that for my iPad?”
This is not going to be changing when computers have evolved to a point where it is all point and click on big shiny pads or small little gadgets with a million and one apps for everything from how many times you picked your nose today to a library for your mp3’s. My best friend and I in high school wrote our own fucking library application in Apple Pascal so we could catalog our LP collection.
Sorry, that is the reality outside of OSNews and the IT geek and hobbyist worlds with the current generations.
Uh no, READ VERY CAREFULLY:
“Basic programming will possibly be considered a basic knowledge area, just like how everyone is expected to know basic maths and have basic literacy skills today.”
Considering you and the other one are continually arguing your strawman, I see no point in rebutting your irrelevant arguments.
Uh no, I don’t “counter”. I responded to another commenter’s question with a possible scenario.
You really like arguing strawmen don’t you?
That example is very relevant in revealing that the criticisms of my prediction are in the very same vein as those made against Admiral Grace Hopper’s.
I even wrote “History is on my side”, the relevance of which is quite clear.
Again, you are arguing a strawman of something I did not say.
I did not say everyone will become a degreed professional. In fact, I very much argued against that.
In much the same way, compared to 50 years ago, people today know more about basic maths skills and literacy skills. Not everyone is a maths or english/language major, but the basic skills are more widespread.
Read carefully next time. Stop arguing strawmen and stop being a dick about it.
Glad to see you’re basing your violent criticisms on anecdotal evidence.
And I’m the one with the poor predictions.
Compare today’s technological and educational situation to 50 years ago, and you’ll see how stupid your short-sightedness is.
You continue to prove your argument IS nothing more than the old man’s “everything was good during my prime, but these current generations are degenerating”.
This is why I resisted addressing your points. They’re old, and I was right about the basic thrust of your argument even though you keep imagining otherwise.
You accuse me of not reading when it is in fact you who is not reading nor understanding at all.
You quote me and then apparently didn’t read what you f–king quoted.
I said nothing anywhere about being degreed IT professionals. I am not a degreed IT professional. The men and women I work with are not IT professionals.
In point of fact, I said that basic programming skills were a part of my generation’s educational experience, whether we went on to get degrees in computer science or not, and whether we wanted them to be or not. Basic programming was not a f–king elective like it is today in most high schools. It was a part of my math classes. Get it? We couldn’t just go out and buy ready-made applications. We, therefore, wrote them ourselves using simple or not-so-simple programming languages.
I started with Turtle Graphics Logo, evolved to Applesoft Basic, and finally learned Apple Pascal, all before I even graduated high school. All of my classmates were the same through Basic. Few went on and became computer scientists and the rest of us became doctors, lawyers, teachers, cooks, etc. Most of us didn’t continue to use it, and yet a lot of us could and did if it was relevant. In my field of behavioral economics, it is and has been an invaluable skill. I still leave much of the heavy lifting to the ‘true’ programmers but I am not an idiot when it comes to algorithms and data structures.
That is simply not the case today. It will not be the case tomorrow, which is what you are arguing. My ‘anecdotal’ evidence was one example of many given of how, no, basic programming is not as ubiquitous as basic maths or basic literacy, nor will it ever be. An educated person must know basic maths and how to read & write in order to function in society. Using computers no longer requires the level of programming that you believe is necessary. Furthermore, the trend is towards less and less need for it from the general population, not more and more as you keep attempting to predict.
In order for me to do statistical research in graduate school, I had to learn to program much of it myself. There were no apps for that type of thing. Today there are, so most students will simply purchase and use them rather than design their own. UNLESS, that is, they are actually f–king getting a degree in the IT field.
And will you drop the damned red herring and your straw men about me saying, when I haven’t, that somehow my generation is special and that it is just about age.
Seriously, I love having a powerful quad core, multiple gigabytes of ram, and terabytes of hard drive space. No, wait, everything was so much better when I was a kid. Let’s bring back 1mhz, 512k, and floppies. I am of mixed thinking on whether basic programming should be required today. It was necessary when I was being educated because of where the computing technology was. It was certainly helpful in my field and would be to my students and interns. Basic programming, if done right, and not just as a sales technique for the latest and greatest commercial programming language, can teach logic, critical thinking, and structured planning. Students do not get a lot of that training in most of their education. But in general, no, basic programming is not as relevant today as it was in the past.
You are the one idealizing a past where it was necessary and so you believe it should be today right up there with basic literacy and basic maths.
It won’t ever be again. It is a bad prediction that won’t be borne out by history in this arena.
Do you deny that a lot of menial jobs are being automated or outsourced?
Never made that argument. Read again.
Your quote:
“Basic programming will possibly be considered a basic knowledge area, just like how everyone is expected to know basic maths and have basic literacy skills today.”
The basic premise of your argument as you wrote it.
If you meant otherwise, then state it again more clearly with different wording instead of the tiresome “You didn’t read my post” when clearly I did. Thanks.
No, it’s not the basic premise of my argument at all.
How does that statement alone idealize the past?
Where do I say “it should”?
These are the things you read into that one statement, not actually made by me.
Stop embarrassing yourself with the strawmen and comprehension ineptitude. Or maybe you’re just psychotic, inventing thoughts for people and then arguing with them as though they had made your points.
You clearly did not read my argument. You read your own argument into mine. There’s a difference.
If you want “idealization of the past”, this is it:
Oh wait a minute, YOU wrote that. Silly me!
I find it precious that you’re trying to argue that more people in the past knew how to program than today.
That is just beyond idiotic.
And there you were, in a previous comment telling me to get out of this OSNews mindset, when YOU are the one basing your cynicism on your own narrow view. Here’s a hint, when you were in school and “had” to write your own programs, the other people in your school probably didn’t have their own computers.
You’re the one idealizing the past and then projecting it onto my prediction of possibilities.
Did you say somewhere you work in psychology? How about you psychoanalyse yourself and figure out how you missed the obvious fact that you’ve projected yourself onto me and are arguing against that projection? You may have serious issues with self-hate.
Ah, the last resort of a feeble mind – the ad hominem attack on someone you don’t even know.
Your reading comprehension is appalling.
You dream of a future where, as part of a standard preparatory education, computer programming will be as common as literacy and mathematics have become. You have stated that over and over in numerous posts.
I pointed out that the closest to that reality was in the infancy of home computing because during that phase of development to accomplish most things on the computer one had to know at least the rudiments of programming. There were no gui’s. There were no app stores with a thousand ready made applications waiting to be downloaded.
So yes, during that time in most public schools, computer classes, which in my large city’s public high schools were part of the math department, involved actual programming. Yeah, maybe only Logo or Basic, but programming nonetheless. No, not everyone had a fucking computer in their homes. (You do love those non sequiturs, don’t you?!) No, not everyone went on to become computer scientists or aspired to work in the IT field.
By the late 1990’s that had changed. Computer classes today in most public schools do not involve programming. They involve how to use productivity applications like office, multimedia applications like Garage Band, etc. Kids get iPads now for eBook reading instead of carrying around a backpack full of heavy textbooks. Yes, there are programming classes but, in general, not for the entire school. They are there as electives for the kids that want to and will go on to college or tech school in order to enter the IT field.
This trend is here to stay and frankly will get worse. Computers no longer need the general user to know anything about what is actually happening when they turn on the device. Single-user devices like iPads, with single-application-at-a-time usage patterns and automatic saving of files for use in other applications, require no thinking from the user. It has become like driving a car or using a microwave. Learn a few basic steps, and then you are fine.
I am not idealizing the past by simply pointing out, as you apparently were not there, that the time of computer programming being a requirement within general education has come and gone due to the evolution of the industry and marketing behind current computer usage for the masses.
I was laughing at the fact that what you dream about has already come and gone. It will not likely be back in vogue any time soon due to current realities.
I know you will simply not get it. That’s fine. You will grow up someday presumably. While it has been ‘interesting’ debating with you, I am quite thoroughly done with this conversation with you.
Says the man making the ad hominem.
Prediction != dream
Where did I say standard preparatory education? I said basic knowledge of basic programming. I didn’t say how or where it’s taught. What was that you said about reading comprehension?
Bullshit. This never happened. In your own words, step outside of your own little world. Computers were not common, and you were in a very small group of privileged people, and you make it seem as if a sizeable percentage of the population had to write their own programs.
These days, any middle school kid can learn programming on their own through internet tutorials. And they do this today in greater numbers than when you did it, old man.
Thank you for making my point for me. You say I am having a “dream” about increased programming literacy. You are basically admitting that NOW, compared to THEN, the trend of programming literacy is increasing.
They’re only a non-sequitur to you because you are a moron who can’t piece information together. Here’s a hint:
COMPUTERS HAVE BECOME MORE COMMON PLACE. PROGRAMMING TOOLS HAVE BECOME MORE COMMON PLACE. PROGRAMMING TUTORIALS HAVE BECOME MORE COMMON PLACE. PROGRAMMING HAS BECOME EASIER.
The trend has been up.
Again with the schools. Keep banging on about that old man. You only show your lack of lateral thinking – ABOUT THINGS THAT ARE ALREADY HAPPENING. People are self teaching outside of schools and universities.
As menial jobs become automated, the only jobs that will be left are computerized ones that can’t be cheaply automated.
Programming was never a general education requirement. Not everyone went to your fancy ass rich school, and not everyone in your school learnt programming. Stop dreaming that you were somehow in the norm, you over privileged jackass.
Come and gone?
More people today know about programming than when you were in school. The trend is an increasing one. This is a fact.
“Current realities” are complete bullshit. Can you get it through your moronic skull that “current realities” do not make a good predictor of tech trends? Can you get it through your moronic skull that naysayers have been wrong before and that you are making the same old arguments the naysayers of old made?
You definitely are a rude, arrogant, and unintelligent little prick, aren’t you?
Ah, if only I grew up with the privilege you imagine I had. Nope, sorry, just a very smart kid whose parents taught college English in a small town in the rural south of the US of A. Thanks to a friend in the computer science department, I was able to get an Apple II, which was used by everyone in my family. I wasn’t the only one. In fact, most of the kids in my very normal public high school had computers – Tandys, Amigas, Commodores, TI99s, Apples, etc. We weren’t as backwards as you might think, and yeah, computers really were common even then. No, not like today where 5-year-olds have iPads and every college student carries a mini-computer as a phone in their back pocket, but yeah, we still had computers. Sorry to bring you back to reality, kid.
Yes, any kid today COULD learn programming. But, listen, outside of your fucking little IT bubble world, most don’t. Sorry, they just don’t. They love having iPhones. They love iTunes. They love playing Angry Birds. But no, they really don’t do a whole hell of a lot of programming – even basic shit.
I get paid to make predictions. Check back with me in five years, and we’ll see whose predictions are more accurate.
Yeah, you have a problem with comprehension. So I imagined you “grew up with privilege”, and yet somehow I also think you were “backwards”?
Jesus Christ you’re a moron.
No, they weren’t. Sorry, but you’re extrapolating from your bubble world. Even if everyone supposedly had computers, an even smaller percentage bothered to write their own programs.
Sorry, it just wasn’t as widespread. Not everyone had the same childhood as yours.
You’re the one living in a bubble world. I can assure you fewer people learned how to program back then compared to today.
Sorry, but your old-man “things were better in my day” cynicism does not count as evidence. Just because you can’t see the actual evidence, remaining wilfully ignorant, does not mean people aren’t doing it.
I specifically mentioned a timeframe OVER 50 years.
You’re an arrogant and unintelligent little prick, aren’t you?
Please learn the difference between 5 and 50. Maybe you’ll realize that your biased observation sample is anecdotal evidence, no matter how much you wish it weren’t. A person who doesn’t understand the basic number line isn’t going to understand statistics.
Listen you little shit. You think you are such a smart young man. “Look at my predictions.” I am unique in what I predict, and if anyone disagrees, then fuck them, they are just old, stupid, etc.
Well, your fucking little idea is not original. In fact, it sounds an awful lot like plagiarism of a DARPA grant proposal done in collaboration between the Corporation for National Research Initiatives (CNRI) & the Python group, called CP4E (Computer Programming for Everyone). Its basic premise is best shared as quotes:
We ask a follow-up question: “What will happen if users can program their own computer?” We’re looking forward to a future where every computer user will be able to “open the hood” of their computer and make improvements to the applications inside. We believe that this will eventually change the nature of software and software development tools fundamentally.
We compare mass ability to read and write software with mass literacy, and predict equally pervasive changes to society. Hardware is now sufficiently fast and cheap to make mass computer education possible: the next big change will happen when most computer users have the knowledge and power to create and modify software. The open source movement claims that peer review of software by thousands can greatly improve the quality of software. The success of Linux shows the value of this claim. We believe that the next step, having millions (or billions) of programmers, will cause a change of a different quality–the abundant availability of personalized software.
The tools needed for this new way to look at programming will be different from the tools currently available to professional programmers. We intend to greatly improve both the training material and the development tools available. For example, non-professional programmers should not have to fear that a small mistake might destroy their work or render their computer unusable. They also need better tools to help them understand the structure of a program, whether explicit or implied in the source code.
Our plan has three components:
Develop a new computing curriculum suitable for high school and college students.
Create better, easier to use tools for program development and analysis.
Build a user community around all of the above, encouraging feedback and self-help.
These components come together in the scientific exploration of the role of programming in next generation computing environments.
We intend to start with Python, a language designed for rapid development. We believe that Python makes a great first language to learn: Unlike languages designed specifically for beginners, Python is also the choice of many programming professionals. It has an active, growing user community which has already expressed much interest in this proposal, and we expect that this will be a fertile first deployment ground for the teaching materials and tools we propose to create. During the course of the research we will evaluate Python and propose improvements or alternatives.
http://www.python.org/doc/essays/cp4e.html
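(As an aside, the proposal’s “great first language” claim comes down to readability: Python stays close to plain pseudocode, with no boilerplate between the learner and the output. A minimal, hypothetical first-exercise sketch, my own illustration and not anything taken from the CP4E materials:

# A hypothetical first lesson: no type declarations, no compile step,
# just a list of numbers and two lines of arithmetic.
temperatures = [12, 17, 21, 19, 15]
average = sum(temperatures) / len(temperatures)
print("Average temperature:", average)

# Even the control flow reads like the sentence describing it.
for t in temperatures:
    if t > average:
        print(t, "is above average")

That is roughly the level of “programming literacy” the proposal had in mind.)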
Yup, that sounds pretty much word for word like what you have been sharing with us here in this discussion, doesn’t it?
Now comes the fucking reality check. The reason I chose the time frame of 5 years is because that is the standard time frame for grant models involving things just like this. So, yeah, real-world shit instead of your ‘ideas’ about statistics.
So how well did this program do? Turns out, not so well. Funding was received for one year. Then the people moved on. The program was never renewed, and as the Python.org archive page for this now-defunct project states, “It is in limbo”. The Python in Education Special Interest Group is a dead link. CNRI has moved on, with no initiatives involving anything like this. DARPA has nothing either.
So, yeah, this isn’t going to happen. I don’t need even five years to predict this. I can look at an initiative from over 13 years ago that failed before it even reached its five-year goal mark. You want something in this current climate to become ‘mainstream’, a part of society? It gets done in collaboration between the government, academia, and then private corporations. You do know the history of the internet, right?
Well, government right now in America is cutting funding and social services. It is all about ‘austerity’ and balancing the budget after we overspent on two fucking wars. Academia is becoming less liberal and idealistic and more conservative and pragmatic. Most institutions, including the ones I teach at, are now partnering with corporations to increase their funding, which dictates in part what is being researched and taught. Finally, corporations want more ‘users’ than ‘developers’. Users buy hardware and software constantly. Developers are being locked into ‘proprietary’ off-shoots of open programming languages and pushed to constantly upgrade to the next development model, driven by the users’ consumerism. But, hey, you know this too, because you work in IT, right?
We all live in bubbles. Mine is just a lot bigger. Between a private consulting business, work at various corporations and government agencies, and teaching, I have experienced and learned a lot more. That is one of the nice things about being ‘older’. We have the experience and knowledge to actually start getting it.
For now, kid, you just don’t get shit. Hopefully, you will outgrow it, and hopefully you are not representative of your generation.
Never said my predictions were “unique”.
You haven’t actually touched any of my points at all, but continue to argue strawmen that you’ve projected onto my comments.
Never said it was, dumbass.
I never said anything about school programs. I keep talking about things HAPPENING OUTSIDE OF SCHOOL. DO YOU NEED GLASSES, OR DO I HAVE TO KEEP WRITING IN ALL CAPS?
No, NOT word for word at all. And some of the “plan” actually DOES EXIST RIGHT NOW. Fuck, you’re an idiot.
Then you are arguing a strawman. My prediction is about the next 50 years. Argue my prediction, or get the fuck out.
And this is why no student has ever picked up Python again… oh wait, that’s not what happened.
You’re still stuck on “official programs”. What’s happening now is pretty much not done by any single organizational effort. You never heard of “grass roots”?
You keep saying “want” or “dream”. I’m saying it “will” happen whether anyone wants it or not. Stop trying to put words in my mouth, you retard.
Because EVERYTHING in IT has happened purely in corporate settings, and corporate settings have been oh so successful, right?
Explain how more people today can drop out of university and manage a successful IT-based startup, and why your bubble doesn’t account for that. Your narrow field of view obviously never incorporates anything that happens outside of it.
I never said I was. THAT WAS YOU. You’re the one claiming everyone in America was like you: “Everyone learned how to program because everyone had a computer.” THAT WAS YOU.
Here’s an easy test. How many people your age either have a programming job or have had some hobby involving programming at some time in their lives? I can tell you it wasn’t 100% of the American population your age. I can tell you it’s probably not even close to 10%.
You are fucking delusional. Prescribe yourself some pills for your psychosis.
This has actually become quite fascinating: the way you move the goalposts, fail to use logic, and are apparently not even aware of your own premises.
So, how exactly did the general population become literate and maths-capable? Was it a grass-roots effort? Read a history book for the answer, smartass.
That’s why it isn’t a straw-man to point out that your premise requires formal education in order to be even remotely accurate as a prediction, whether in 5 or 50 years, and I provided you several excellent rebuttals as to why that isn’t going to happen. They included: failed attempts to implement such an education program, like the CP4E program over 13 years ago (the only part of that program still in existence is IDLE – look it up on the fucking webpage!); the changing technology since I was a kid, when more ‘programming’ knowledge was needed to use the damned things (not that everyone had them or programmed them to the level of literacy, just that there was more of a need, and a greater thrust in the public education systems to ‘learn’ about computers rather than just ‘use’ computers like today); and the current corporate business model, like Apple’s and Microsoft’s walled gardens, vertically integrated hardware and software solutions, transparent and easy-to-use ‘computing devices’, and runaway consumerism.
Sure, Python is a very popular programming language, but that doesn’t mean it is being used in enough schools, which is where this ‘programming literacy’ would have to be implemented in order to become as widespread as you predict within 50 years. Populations became literate due to institutions like the state and the church.
And yes, it is fucking idealistic, and delusional I might add, to imagine that some grass-roots effort involving websites like the Khan Academy is somehow going to make populations computer-programming literate in the next 50 years. You and I enjoy programming, even if only one of us is actually in the field itself.
How many of your friends and family are not in the field? How many of them program? How many actually turn to you for assistance on usage, as opposed to figuring out what is going on behind the scenes? How many of your neighbors? How many of their kids? Have you got kids yourself? Have you got any idea what is happening with children in the public educational system?
You may be too young to remember ‘shop classes’. They were a regular part of public high school education in America. They are no longer. Why? One reason is the thrust of simply getting schools tested and students into colleges. Another reason is that most appliances, cars, etc. are not worked on daily by the general population the way they were 50 years ago. If I can’t program an embedded device, how do I fix my new Kia? Microwaves are thrown out, not kept and fixed. We pay a repairman to come fix the complicated computer-based dishwashers. The same has become true for computers and programming. Knowing any of it is simply not necessary to use the damned things any more. I can play my games, purchase my apps, send texts, surf for porn, and never need to know what the fuck the device is doing, how it is doing it, or how to make it do the things I want it to do.
http://www.forbes.com/sites/tarabrown/2012/05/30/the-death-of-shop-…
So who is putting words into the other’s mouth? I never said it was 100%. Look up this little statistic, dumbass. How many women in the 1970s and 1980s became computer programmers? Compare and contrast that to how many young women in the 1990s and 2000s became computer programmers. Extrapolate from there some possible reasons and causes. That might answer a few of your objections to some of my comments.
Just because you believe it will happen doesn’t mean it will. Which, I might add, is a sure sign of psychosis. So your response, I am sure, will be pithy, entertaining, and delusional, like all of your other ones have been.
Have at it. This is quite entertaining now watching you dig yourself deeper and deeper in the bullshit.
Doesn’t mean programming has to follow the same path. You have a very linear mind.
I raise maths and literacy as an example of widespread basic understanding, and somehow you extrapolate it to mean “that’s the only way anything else has to happen”.
You didn’t. Really.
Your only “rebuttal” is to label my predictions “dreams” and “wants” and “idealism”, and none of your anecdotal evidence actually addresses what I’m discussing. Labelling does not constitute an argument.
Maybe you should read my conversation with Alfman if you want to know what a proper discussion looks like.
I’m having a similar, but much more informed, conversation with Alfman on this issue.
Schools, huh?
You know, a lot of my friends are in the IT world. I’d say a higher proportion of people in my age range get jobs involving IT (whether it’s programming, administration, or engineering that relies on computer simulations) than did in your time.
I know that’s anecdotal evidence, but you don’t seem to mind.
There’s a bifurcation happening for sure. Consumer products are closed off. But more complicated stuff requires programmable computers. Given the increasing tide of automation and outsourcing, the only jobs left in 50 years will be computing jobs. Menial and manufacturing jobs can be automated, but the task of automation requires human intellect.
Of course, I’ll never expect you to look at the big picture as a system.
No, but you made it out as if everyone in your age range owned computers and had to write programs. You never said 100%, but you clearly implied it was the norm. It never was the norm. It was the norm only amongst a small subpopulation, hence not the norm across the actual population.
How many PEOPLE in the 70s and 80s were programmers? How many PEOPLE today are programmers?
This cherry-picking of data of yours is very telling of your mindset. And you have the nerve to say I raised irrelevant points.
If you don’t mind, I’ll just continue my discussion with Alfman. He actually doesn’t use any strawmen and has powers of comprehension. You can rant all you like.
But that’s how progress works, not a “downward trend”: we devise new ways to augment our bodies, including new prostheses of the mind (like books replacing the memorisation of everything), so that some of us can move on to new challenges.
Programming for all is a good concept. Inclusion is good, exclusion is bad.
If you want to talk about the future, talk about whether the march of great, mature open source software will kill the need for employing so many programmers. If that happens, what next? When good software is unchanging and ubiquitous, like the microwave and toaster oven designs that seem unchanged in ten years, will the new goal of programming be… to make programming different so new types of people can participate? Will everyone use the same Twitter client for 50 years, like they and their parents ate Cheerios? Or will it be like fashion, where the next great thing for the individual isn’t just done by a professional somewhere, but also by little girls after Christmas, thanks to the Bedazzler or the nail polish patterns someone kindly invented so that those other than the professionals could participate in creation too?
Just sayin
It probably won’t. Not because of great, mature open source software, anyway. Great, mature open source software, at least today, seems to create business opportunities, and seems to have resulted in more people being employed to program than it has driven out of programming.
Yes. People seemed to miss my point on that. For whatever reason (possibly lack of foresight and imagination), they seem to think I’m arguing that people in the future would need to learn programming at the systems level as a basic requirement.
We’ve seen how SGML-based and HTML-like languages changed the average user’s ability to program computers. Yes, it resulted in Geocities homepages for pets in the beginning, but pointing at those would be a ridiculous counter-argument. At the very least, you can’t stop people from making what they want.
No. People haven’t even used the same social networking sites for 10 years.
Yes. It seems like only a short time ago that even knowing how to use computers was considered nerdy and uncool. Now even your average bimbo has an iPad.
Of course, we also need to remember Admiral Grace Hopper. The computer scientists of her time not only thought that compilers were impossible, they also thought opening up computers to a greater audience would RUIN EVERYTHING. What they didn’t anticipate was that computers and their programming models would themselves change to adapt to people.
It’s funny to see the same macho types today trying to fence off programming for the “elites”, not knowing they are just repeating history with their short-sightedness and pretend-OCD.
Nobody is ever “just sayin”. If they were, they wouldn’t need to end with it.
Just breathin