You sent an emacs link to a vim guy
But I'm open-minded enough to try it one of these days.
What's your day job if you don't mind me asking?
Getting back to the OP, I recently came across this blog post reflecting on the possible death of the keyboard, given advances in touchscreen technology and the conditioning of the next generation toward graphical/touch interfaces (and voice input).
Having recently purchased a tablet, I do find that there are some tasks that feel more natural using gestures or stylus input: drawing, selecting hyperlinks, zooming, panning, etc. But many of these tasks turn out to be things I would normally use a mouse for, not a keyboard. So perhaps touch interfaces will be the death of the mouse, but not the keyboard? Perhaps the keyboard concept will remain, but in a more virtual form? In which case, perhaps programming languages will retain their resemblance to the written word instead of evolving into a more abstract symbolic form, e.g. something resembling mathematics or a diagram.
I don't think programming languages will ever become GUI-oriented, because a GUI simply isn't flexible enough. Take LaTeX, for example: typesetting can be automated by a WYSIWYG program, but the control and precision of a markup language are what keep it prevalent. It is far less efficient to drag components around and arrange them on screen than to type it out. I could possibly see a future where the keyboard becomes obsolete and input is by accurate voice control or stylus+touchscreen, but it will take a leap in technology to make that cheap enough for adoption. As for a language that resembles mathematics, I suggest taking a look at Haskell.
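As a quick illustration of what "resembles mathematics" means here, a minimal Haskell sketch: list comprehensions read almost like set-builder notation, and recursive definitions read like recurrences.

    -- The set { x^2 | x in [1..10], x odd }, in comprehension notation.
    oddSquares :: [Integer]
    oddSquares = [ x * x | x <- [1 .. 10], odd x ]

    -- Quicksort written as a recurrence rather than a loop.
    qsort :: Ord a => [a] -> [a]
    qsort []       = []
    qsort (p : xs) = qsort [x | x <- xs, x < p]
                  ++ [p]
                  ++ qsort [x | x <- xs, x >= p]

    main :: IO ()
    main = print (qsort oddSquares)  -- [1,9,25,49,81]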
It is very unlikely there will be a single programming language of the future, as there are many domains, each with different objectives. Does the software need to be real-time? Does it need to be maintained? Does it need to execute on a specific architecture? Does it need to be fault-tolerant or safety-critical?
Software has evolved from op-codes and assembler to high-level code that typically runs within the constraints of an operating system. All of these still rest on the original op-codes, but we now write software at a much higher level. So the question is: what is the next level of abstraction? Would it be possible to capture the advantages of diagrams and code in a single language - diagrams for the high-level concepts and code for the detail?
Maybe the compiler could define an optimal threading strategy for multiple processor cores.
I've wanted to post a link to this video since this thread started. Behold... the future!
Very interesting, thank you.
You sent an emacs link to a vim guy
Nobody's perfect.
What's your day job if you don't mind me asking?
I work here: http://www.quest.com/authentication-services/
If I were to ask you a hypothetical question, what would you want it to be about?
Maybe the compiler could define an optimal threading strategy for multiple processor cores.
This seems like the low-hanging fruit. I imagine a compiler that does automatic multiprocessing reasonably well would be a huge benefit in terms of run speed gained per unit of programmer effort.
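Haskell's parallel strategies are a small existence proof that part of this already works: the programmer only marks what may be evaluated in parallel, and the GHC runtime decides how to schedule it across cores. A minimal sketch, assuming the parallel package, compiled with ghc -threaded and run with +RTS -N:

    import Control.Parallel.Strategies (parMap, rdeepseq)

    -- A deliberately expensive function: naive Fibonacci.
    fib :: Int -> Integer
    fib n | n < 2     = fromIntegral n
          | otherwise = fib (n - 1) + fib (n - 2)

    main :: IO ()
    main = do
      -- parMap sparks one evaluation per list element; the runtime,
      -- not the programmer, maps those sparks onto available cores.
      let results = parMap rdeepseq fib [28 .. 34]
      print results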
There has been an increasing trend in research on the processing of unstructured language by computers, the best example of which so far is the IBM Watson supercomputer. If this technology matures (and it will), compilers would probably be produced that understand what the programmer wants to achieve and create the binary. Input to the compiler could be through any medium, text or voice, but the difference is that there would probably be no need to follow a syntax, so no question of syntax errors. Semantic errors could be handled by the compiler itself once it understands the purpose of the program or of a program segment.
For example, touchscreen technology was invented in the 1970s, reached hand-held devices in the 1990s, and now has commercial production and a wide range of applications, so it can be considered mature. Likewise, this technology will take some time before it is accessible to people beyond computer scientists and is implemented in production environments. But it will be.
Seems I'm a bit late to the party but I'll chuck in my $0.02.
It occurs to me that human beings are, generally speaking, wired to use language to express ideas, and there are at least a few studies showing that the ideas we can conceive are to some extent limited by the language we use to express ourselves. So I don't think programming languages are ever going to die in favor of graphical abominations like LabVIEW. At the same time, LabVIEW isn't going anywhere either, because it's great for what it's used for: rapid prototyping and automated testing by technical non-programmers.
I dream of a day when keyboards, mice, and monitors are a thing of the past -- for users. In my vision of the future, the computer is an artificial intelligence, like a family member or intelligent pet -- something like the "family" in Fahrenheit 451 crossed with Tony Stark's valet in Iron Man and the offspring of Hal 9000 with the ship's computer from Star Trek (minus any creepy homicidal or brainwashing tendencies). For users. Why? Because it makes the technician's job WAY easier. How many everyday problems could we solve if we could divorce computer interactions from any kind of I/O device? "I saved my resume on a USB and now Word won't open it!" "I opened Microsoft and now my desktop is frozen!" "AOL won't load!" "My cupholder is broken!" "My mouse won't move!" "There's a strange icon on my desktop, could it be a virus?" Giving end users buttons to press and icons to click and error messages to misinterpret is just asking for this kind of thing, because for everyone who can figure stuff out on his own there are two dozen who only know enough to get themselves in trouble. (Which is why Apple deliberately limits their software in terms of what you can do that isn't bleedingly obvious.)
The highest level languages, well, I don't doubt that they will change -- and in fact, the highest level language could well be English. "At 7 a.m. tomorrow, turn on the coffeepot and set a timer for 10 minutes, then wake me when the timer goes off." But all this AI and the underlying code, right down to the firmware, will still need to be done in a relatively ordinary programming language. Because English is too redundant, too ambiguous, and too verbose to use for circuit design or logic. Perhaps we'll invent perfectly unambiguous spoken languages like Loglan from... oh, what was that book? Luna maybe? But that would still be a very high level language, no replacement for the likes of VHDL, assembly, or C. Those three aren't going to disappear.
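Whatever form the English front end takes, it still has to lower sentences like that into something unambiguous. A toy sketch of the idea in Haskell (all type and constructor names hypothetical, invented for illustration):

    -- A hypothetical structured form the coffeepot sentence might
    -- compile down to: unambiguous data instead of English prose.
    data Device = Coffeepot | Alarm
      deriving Show

    data Action
      = TurnOn Device
      | SetTimer Int Action   -- minutes to wait, then which action
      | WakeUser
      deriving Show

    -- (trigger time, action) pairs for "at 7 a.m. tomorrow ..."
    morningRoutine :: [(String, Action)]
    morningRoutine =
      [ ("07:00", TurnOn Coffeepot)
      , ("07:00", SetTimer 10 WakeUser)
      ]

    main :: IO ()
    main = mapM_ print morningRoutine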