Can anyone define "succeed" in this context? Is it purely popularity? What if a language is inferior but popular?
If a language is effective for its user(s), then is it a "success"?
Is there some minimum number of users that delineates a threshold for "success/failure"?
What about DSLs that might have a limited number of users?
I like that the author recognized the "success" of shell and C as tied to the "success" of UNIX. But he forgot others, such as sed.
Some of my favorite and most powerful languages are not widely used. I have no idea what "succeed" means to others in the context of programming languages, but these languages have "succeeded" for me. They get the job done. Efficiently.
Nothing has surpassed Snobol for pattern matching. (If you are in doubt, post a pattern matching challenge and let's see how the solutions in various languages stack up.)
Meanwhile, other implementations of pattern matching have become more popular (= more "successful"?).
Perhaps the word "succeeded" here simply means "succeeded in becoming popular?" If so, then I apologize for the gibberish.
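To make the challenge concrete, here is one classic: deciding whether parentheses are balanced, something Snobol's recursive patterns express directly and classic regular expressions cannot. A minimal sketch (Python chosen only for illustration, using an explicit counter rather than a pattern):

```python
# Balanced-parentheses check: trivial with Snobol's recursive
# patterns, impossible for classic regexes. A hand-rolled
# Python version for comparison.

def balanced(s: str) -> bool:
    """Return True if every '(' in s has a matching ')'."""
    depth = 0
    for ch in s:
        if ch == '(':
            depth += 1
        elif ch == ')':
            depth -= 1
            if depth < 0:        # closing paren with no opener
                return False
    return depth == 0

print(balanced("(a(b)c)"))   # True
print(balanced("(a(b)c"))    # False
```

Now try expressing the same thing as a single pattern in your regex dialect of choice and see how far you get.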
aka web-dev-world, where the lowest common denominator has more foot-guns than C. Worse, the foot-gun often won't go off when you pull the trigger, but some time later, perhaps when it is aimed at your face.
Whenever this gets posted, comments seem to focus on what to use going forward. It seems obvious for most projects there are better alternatives.
However, whenever I see autotools documentation, I think of only one thing: this old system is worth learning because, even if you never use it for your own projects, so much code written by others over the past decades requires autoconf, automake, often libtool, and sometimes pkg-config.
For me, understanding how these old hacks work is very important for getting a large majority of open-source software projects to compile after I make modifications, e.g., removing code.
Oftentimes I think that people who use autotools do not truly understand _how it works_; they have only figured out _how to use it_. That is reasonable, but the problem is that autotools is very brittle, and when it breaks they have little idea how to fix it.
To the author: I could not agree more. In the long term history of computing, I would hope this stage we are in now is not the height of the "revolution".
In my humble opinion, to which I am entitled, current Apple hardware is still well designed like the Apple hardware of the past, but none of it resembles a "bicycle for the mind".
These phones and tablets are "computers", but they are programmable only by permission; they are consumption instruments meant to support some plan to dominate the communications, media, and entertainment industries. Not my idea of a programmable, pocket-sized, networked computer.
All due respect to Apple and their wild commercial success, but looking to the future, I get more excited about my RPi or Teensy than I do about my Apple devices.
I have little interest in paying for a license to a bloated, complex, proprietary IDE (Xcode) and seeking approval from an "app store" when I can write ARM assembly from a netbook or laptop using a free and open source assembler and run it instantly on the RPi.
The revolution is yet to come. I hope. kparc.com/o.htm
You can code on iOS devices. You can code in many different programming languages on IDEs available directly on the device. The code can be executed, and so programming is not inhibited.
What you are referring to is deployment. You want to be able to deploy or distribute your programs freely through the official channels. And because that is locked down, you consider it "programmable by permission." I would argue that this is not the case.
All I care about is the logic and elegance of programming. I don't care how my program runs: whether I write ARM code simulated through some App Store app, Lua code run through an iOS game engine, or code deployed directly to the hardware. That is immaterial, because I still get to enjoy the art of programming.
Apple's phones and tablets are programmable computers and they can be programmed through officially and unofficially distributed apps. Both free and paid, open and closed source. Just because the official distribution model doesn't suit your personal preference does not make these devices any less programmable computers.
Also note that Xcode is free and you can freely deploy apps from Xcode to your devices. So I am not sure I understand your criticism here. Nor do you need to even use the closed-source IDE (Xcode) when the compiler and language are open source.
The tablet point is a red herring. 8-bit micros were hardly portable. So discussions about whether or not you can code on iOS or OS X are tangential.
The real issue is how easily you can code. 8-bit micros hit the sweet spot. No system available today comes close.
You powered up the machine, and the first thing you saw was a BASIC line editor. There was nothing else to distract you. It was instant-on with no setup.
You had to write code to use the machine at all. You even had to write code - albeit one line - to load a game from a tape.
For the gifted, BASIC led naturally to machine code and to graphics made by writing bytes into memory.
No modern environment has anything like the same simplicity, directness, or sense of natural progression.
Xcode, gcc, anything with a build system (never mind a package manager) are insanely complicated in comparison. They're so complicated professionals have to write books explaining them to other professionals.
Even Python - possibly the best candidate for a successor to BASIC - has a quirky IDE and two and a half different popular versions, and a lot of other complications that a BASIC cursor doesn't.
JavaScript? You really have to learn CSS and HTML and jQuery and $(infinitely long list of frameworks goes here) and - oh look, is that the time?
There is a huge difference between encouraging programming by making access to it friction-free and trivially easy, with a learning space that is comprehensibly small but not dumbed down or toy-like, and merely making programming possible for users who don't mind clearing a lot of hurdles.
That first category is completely empty today. It shouldn't be, but it is.
I often get told that my iOS coding app takes people back to their BASIC days. Most people who use it end up telling me they are strongly reminded of the magical feeling when they first discovered programming in their youth.
Kids and schools who use my programming app seem to enjoy the same feelings, but for the first time instead.
I don't believe that it was ever easier to code than it is today. There are myriad ways to discover the magic of programming. Coding isn't "for the gifted," it's for everyone. Just because the gifted were the only ones who got past the opaque interface of an empty BASIC editor doesn't mean we should go back to those days.
Your nostalgia for the early days is served right now by beautiful recreations of those environments, built by the people who loved them [1].
I have been developing a friction-free programming environment for over three years now. And it's not dumbed down or toy-like. It's instantly on, you're immediately inside the most interesting place to code, and you instantly run your programs.
I code with my three year old. We do logic and programming games, play with visual programming, and so many other things that never existed when I was a kid. It will only get better from now.
You and I live at a time where every new day is the best day ever to learn to code.
Yeah, complexity and that we hide the carrots. Bear with me.
I still remember a point in my early teens when I decided I wanted to learn to program. I ran into a wall so hard I stopped for a good year or two.
The complexity is a considerable cliff unless you have a guiding hand (thanks, dad, for throwing me a copy of K&R and going off to watch Star Trek!), and if you don't have one, you're not getting to Python today either.
First there's the "What language do I learn" search query that turns up so much confusing information. "Depends on what you want to do with it". Oh gee thanks, but I don't know what I want to do with it. I just see that other people are doing cool things with it, so I want to know how they do it.
So you dabble in Python, C, Java, whatever has install instructions you can follow, because the moment a terminal prompt pops out you're scared that you'll screw up the family magic box and get into a lot of trouble. At this point curltarmake sounds like a Polynesian dish to you. Ack.
But you never actually do something useful. Thank you, helpful online tutorial: I now know how to print "Hello World" 100 times without copying and pasting it (though that would have been faster). There are carrots for sure, but you never see them at that point. So you're like, "eh, this tutorial is silly, let me try another of these 10 suggested languages".
And then later on, in high school, you may have the considerable fortune to get to take a programming class that walks you through Java, of all things. All these OO concepts make little sense to you, but eh, you'll be tested on this. The code is still onerous and useless, but you signed up for the class. Towards the end, you'll slowly get an idea of useful things you can do with it (finally! after all these years!). Most of all, you'll develop a distaste for Java.
The wonders of childhood. I honestly believe Assembly is much more straightforward than trying to grasp the current programming ecosystem as an outsider or a kid. There's so much stuff to become acclimatized to.
So recently I've been working with a guy who wants to get into programming; he did a bit of HTML&CSS&JS&Whatnot half a year ago, but he was stuck in that "I don't know where I'm going" stage. The first thing I recommended was "Automate the Boring Stuff with Python", a book I absolutely adore, because it first walks you through the basics (including setup) and then immediately gives you one library after another to string scripts together with.
Yes, the code may not teach the best conventions or habits, and it never delves into OO or version control systems, but it's interesting hubris to think beginners care about that before they've gotten any utility out of the skill.
And he enjoyed it, learned a bunch of things, started writing his own scripts, and is now getting into Django and reading through the Python Cookbook.
So yes, I think we're missing the mark on showing kids and newcomers how to program, in part because it's gotten more and more complex, and in part because we give them keys to doors they don't even know about, whilst withholding those to doors they care about. We've got a major sequencing problem here.
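The "utility first" sequencing is easy to demo. A hypothetical "Automate the Boring Stuff"-style script (the task and paths are my own invention, not from the book): tidy a messy folder by sorting files into subfolders named after their extensions.

```python
# Hypothetical beginner script in the "automate the boring stuff"
# spirit: move every file in a folder into a subfolder named
# after its extension (e.g. report.pdf -> pdf/report.pdf).

import shutil
from pathlib import Path

def sort_by_extension(folder: Path) -> None:
    """Move each regular file into a subfolder named after its extension."""
    for f in list(folder.iterdir()):          # snapshot before we add subdirs
        if f.is_file() and f.suffix:          # skip dirs and extension-less files
            dest = folder / f.suffix.lstrip(".").lower()
            dest.mkdir(exist_ok=True)
            shutil.move(str(f), str(dest / f.name))

# Usage (this really would reorganize the folder, so it's commented out):
# sort_by_extension(Path.home() / "Downloads")
```

Ten lines, zero OO, and a visible result the first time you run it. That's the carrot up front.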
Don't be silly: Apple gives Xcode away for free; it is not bloated nor complex, nor particularly proprietary (who do you think made LLVM?), and you don't need to seek anyone's approval to upload your software to GitHub or to run it on your own device. You can write ARM assembly on your laptop and run it on your iPhone even more easily than you can on an RPi.
You are really wasting time hating on Apple and spinning them (if you really are an Apple device owner), since Apple is the one who broke open the mobile device so you could run software without permission (before, you had to get AT&T's or Verizon's permission to put code on your phone) and who made high-quality development tools and platforms available for free.
Apple is the one who shipped the Apple II with BASIC in ROM... and they haven't stopped.
But that argument simply boils down to "more expensive computers cost more". True, a Linux computer might cost $500, and for $0 more you can program on it. A Mac might cost $800, and for $0 more you can program on it. The marginal cost to develop is the same. But I could easily spec out a Linux workstation costing thousands of dollars. Macs being expensive doesn't make Linux computers more expensive; it doesn't raise the costs for anybody else. So who exactly is being harmed here? I just don't really see what useful point is being made.
If we follow your argument to the logical extreme there is no such thing as free software because the computer to run such software almost always costs money?
(Unless you receive a computer as a gift or it is freely leased; but then that applies to Macs as much as any computer.)
The GNU toolchain (or whatever) is licensed to run on any hardware that you can possibly use it on, which will often be cheaper and more accessible than a Mac. Xcode requires Mac OS, which is only licensed for use on Apple hardware.
You can buy used Apple hardware fairly cheaply, but I think that post still has a point about relative cost.
Maybe nobody will see this, but consider this thought experiment: You have a replicator like from Star Trek, which can make any computer that exists now, at zero cost (say it uses garbage you were going to throw away.) Presumably if you copied an Apple machine, it would not legally be a true licensed Apple machine, for the same reason that making unauthorized byte-for-byte copies of digital files does not create additional licenses to use them however you'd like. If Xcode's license requires licensed Apple hardware, it can't legally be run on that machine. On the other hand, FLOSS compilers could run on that, or on a free/libre hardware design that is not even in a grey area to replicate.
While the analogy is maybe a bit silly or different than what was originally said (interpreting "free of cost" more like "libre"), I think it illustrates a real difference that is relevant.
(Btw, I don't actually know Xcode's license -- maybe it only needs Mac OS for technical reasons, not licensing reasons, in which case it could legally and practically be run on any sufficiently correct Mac OS emulator.)
That's not his argument. His argument is that an Apple computer is required for Xcode. An Apple computer is a regular computer except that it costs more and comes with a bundle of software. Xcode is in the bundle. So... the price is indeterminate but definitely not $0. He hedged specifically against your reductio ad absurdum by mentioning the price difference.
The argument was made against the statement "Apple gives Xcode away for free."
They do give Xcode away for free. I didn't think that factoring a modern development computer into the price was a reasonable stance to take against that line.
A Mac isn't "a modern development computer". It's a very specific modern development computer, and not necessarily the best in terms of quality/cost.
We're getting unnecessarily deep into dissecting this single line anyway. The original poster's point was that Apple's hardware and software ecosystem, while of great quality, still isn't the "bicycle for the mind" because it's terribly locked down.
I find that argument to be bizarre. You can do computer science on paper.
As long as you can write programs and run them then who cares whether you compile them directly to the hardware or interpret them, or whatever?
Apple locks down its distribution platform. It doesn't lock down your brain from thinking or writing programs, and it doesn't lock down its computers from running them for you.
Why does it matter that the money goes to the same vendor?
The argument is about whether the software is "free" or not. This only involves the cost to the end user. It is not concerned with where that money is paid.
> You are really wasting time hating on Apple and spinning them (if you really are an Apple device owner), since Apple is the one who broke open the mobile device so you could run software without permission (before, you had to get AT&T's or Verizon's permission to put code on your phone) and who made high-quality development tools and platforms available for free.
Where on earth are you getting this nonsense from? You could install unsigned applications without anyone's permission on old systems like S60 ages before the iPhone.
Apple didn't even have a mobile app store at launch; it took third parties to get them out of their web-only approach. You couldn't even write native apps for the iPhone until later updates.
"... but it is not how anyone in their right mind writes C."
If I were going to place blame (I think a better approach is to skip that and just solve the problem), then I would blame developers, not the language.
To me, the code you are criticizing is very "regular" -- it follows predictable patterns. I find it easier to follow than most other C code I read.
More importantly, I believe it is worth following. Obviously he is doing something right, as the reliability, performance, and paucity of bugs show.
All this despite not being in the "right mind", whatever that means.
I wish all the programs I am forced to use were written with such care.
I wish all C were written with such care as well. I just wish it had less pointer arithmetic, fewer one-character variable names, and comments, you know, at all...
If sites start moving to HTTP/2, is it true that untrusted code can be inserted into the same stream as the "content"?
Everything could be coming from the same domain/IP? This might make blocking ads and tracking more complicated?
My solution as HTTPS spreads is to MITM my own connections so I can see what is being sent and received over the wire. As the article says, it is a PITA. But it is necessary.
For example, do you have an httpd listening on 127.0.0.1? Do you bind any other daemons to 127.0.0.1 or the broadcast address?
If you operate your own root you can reassign the authoritative nameservers for doubleclick.net to nameservers you control. You may or may not choose to return "A" records.
Hopefully nothing is on 127.0.0.1:80/443; that way WebKit gets an RST back, which I assume is faster (and less error-prone) than serving a dummy page. I'll add a note to the README.
It could also be less than a page; it could be a dummy resource. For example, in the case of an ad server and a smartphone app that has some screen space reserved for ads, you might want your own resource to appear in that space instead.
Another example is reverse-engineering APIs and protocols for popular web services, social media, storage, etc. In that case you might want a "dummy server" that serves certain canned responses.
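For what it's worth, a "dummy server" like that can be a few lines of stdlib Python. The endpoint path and canned response below are invented for illustration:

```python
# Minimal dummy HTTP server: answer every GET for a (hypothetical)
# reverse-engineered endpoint with a canned JSON response, so the
# client under study can be observed against known data.

import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Made-up endpoint and payload; substitute whatever the real
# client expects once you've sniffed its traffic.
CANNED = {"/api/status": {"ok": True, "ads": []}}

class DummyHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps(CANNED.get(self.path, {})).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):   # keep the console quiet
        pass

# To run (binds localhost only):
# HTTPServer(("127.0.0.1", 8080), DummyHandler).serve_forever()
```

Point the app at it via your resolver or MITM setup and you control exactly what the client sees.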
Blog posts are just too boring most of the time. We need more direct quotes from the people toeing the creepy line.
That is the behavior that should be tracked. What do you think these "engineers, designers and policy makers" get up to each day? Maybe lots of "pretending to believe" they are doing something meaningful?
The last line from the article you quote is spot on.
Why are alleged "services" provided for "free"?
One group will tell you it's because advertisers are picking up the costs for "content". Another group will tell you that it's because no user (cf. advertiser) would pay if a "fee" were charged to use the www.
Of course, no "free" business model will dare test the theory of the later group, so I guess we'll never know how the user values these "services". Instead the investors and advertisers set the value. Grossly inflated.
In the early days of the internet as I remember it the real (non-hardware) costs for the internet were tolls on telephone calls (dial-up). Organizations picked up the tab for employees who used the internetwork. Tuition-paying students also got access.
Then came UUnet and "ISP's". And then people had their own personal computer, at home, with a network card.
As far as I'm concerned, the internet connection fee is still the only real cost.
I think the browser you allude to is possible. But I think some changes in how information is structured and presented on the www are needed. If we let the www be shaped solely by web developers with a lust for layers of abstraction and increased complexity, given carte blanche to run code on others' computers, then the "browser" is forced to become something far too complex and too much trouble for any open-source volunteer programmer to deal with.
Make the www easier to parse and then the www "browser" becomes easier to replicate. This is only my opinion. Others would certainly disagree.
I believe we need a multi-pronged approach to retaking the internet from the forces that dominate it now. One prong is a resistance movement, such as I suggest above. Another is to innovate on better ways to finance content and services on the web, be it micropayments or something else. And another is to find a way to counter or eliminate the perverse incentives that drive clickbait, garbage content, and viral shallowness. It is not accidental that I allude to Adam Smith's invisible hand above. He and others knew the key was understanding the feedback loops. The internet's feedback loop is broken. Clicks and quantity drive revenue, not quality.
And yes, my username is a reference to Star Trek, a show which is probably too socialist for the heavily anarcho-capitalist-leaning libertarian crowd here on HN (See the link in my reply to username223).
I'm working on setting up a website where we can raise awareness, change hearts and minds, and support efforts that help us retake the internet. I cannot do it alone, even with my evil goatee. Email me if you'd like to help.