Fully agree with you here. I wanted to add that even the 'parsing' of text is easy using standard tools like awk, sed, etc., rather than writing custom code to debug binary protocols. In addition, with formats like JSON it is easily extensible without worrying about where new data fields are added, as long as backward compatibility is maintained - i.e. if designed correctly, the server can be upgraded to accept new fields while still working with older clients.
Disclaimer: I come from the C++ and PHP world, but in the last few years I have been working in Java after a long gap.
So here are the pros and cons of Java as I see them (not comprehensive, but things I tend to care about):
Pros:
1) Strong typing does help in finding potential bugs at compile time.
2) Generics and the container frameworks seem fairly sane compared to the complexity of C++ templates.
3) Concurrency features are nicely standardized now: the Executor framework, the concurrent container classes, and so on.
4) In short, 'stock Java' (without all the heavyweight frameworks) feels comfortable and sane to work with.
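To make point 3 concrete, here is a minimal sketch of the Executor framework from stock java.util.concurrent: a fixed thread pool splits a summation across workers and collects partial results via Futures. The class and method names are illustrative, not from any framework.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ExecutorDemo {
    // Sums 1..n by splitting the range across a fixed thread pool.
    public static long parallelSum(int n, int workers) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(workers);
        try {
            int chunk = (n + workers - 1) / workers;
            List<Future<Long>> futures = new ArrayList<>();
            for (int w = 0; w < workers; w++) {
                final int lo = w * chunk + 1;
                final int hi = Math.min(n, (w + 1) * chunk);
                Callable<Long> task = () -> {
                    long s = 0;
                    for (int i = lo; i <= hi; i++) s += i;
                    return s;
                };
                futures.add(pool.submit(task));
            }
            long total = 0;
            for (Future<Long> f : futures) total += f.get(); // blocks until each task finishes
            return total;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(parallelSum(1000, 4)); // 500500
    }
}
```

No hand-rolled threads or locks needed; the pool, task submission, and result collection are all standard library.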
Cons:
1) Large, legacy, XML-driven frameworks like Spring and Hibernate are a huge time-sink: figuring out all their intricacies takes forever, and things are very hard to debug, especially when some 'business logic' resides in those mysterious XML incantations.
2) While not really a Java problem, these humongous frameworks encourage building giant war files with code and configs all bundled together. Need a config change? Edit some XML file, then rebuild and redeploy the wars/jars. It takes work to separate code from configs, and to convince traditional Java engineers to move away from this model - i.e. check configs in separately, deploy them independently, restart servers, etc., without any need for a build-and-deploy cycle.
3) Legacy app-server architectures, with Apache in front adding IPC overhead on the way to the Java app-server.
The way I get around some of these cons is to convince the engineers to use the frameworks sparingly (just enough routing rules to map URLs to controller entry points in MVC setups), to use lightweight servers like Jetty (listening directly on socket ports), especially when the server is just handling API requests, and to stick with JSON as the payload format.
It has been a challenge, but I have had success using Java in very limited ways and decoupling it from serving web (JSP) pages and so on.
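The lightweight-server-plus-JSON approach above can be sketched without pulling in Jetty at all; this illustrative example uses the JDK's built-in com.sun.net.httpserver.HttpServer (an assumption on my part - Jetty's API differs, but the shape is the same): one routing rule mapping a URL to a controller-like handler that returns JSON.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

public class MiniApiServer {
    // Starts a server on the given port (0 = pick any free port) and returns it.
    public static HttpServer start(int port) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(port), 0);
        // One routing rule: map /api/ping to a handler returning a JSON payload.
        server.createContext("/api/ping", exchange -> {
            byte[] body = "{\"status\":\"ok\"}".getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().set("Content-Type", "application/json");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        return server;
    }

    public static void main(String[] args) throws IOException {
        start(8080);
        System.out.println("listening on :8080");
    }
}
```

No war file, no app server, no XML: the whole "deployment" is a jar listening on a socket, and configs can live wherever you like.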
Some of the best programmers I have seen have a non-CS background, and were self-taught.
Based on how I interview (and also on various interviews I have attended), here are some suggestions on how to begin:
1) Start with some basic data-structures:
a) Arrays
b) Linked List: both single and double
c) Hash Tables
d) Binary trees, with some familiarity with one or two types of balanced trees - e.g. red-black trees and/or B-trees.
e) Stacks and Queues - built on arrays and/or linked-lists
If you can practice by writing some code in C or Java, that would be a big plus. You can get the hang of the aforesaid structures by coding the usual functions: find an element, insert, modify, delete, iterate through all the elements, and so on.
As you do this, get a feel for big-O notation - i.e. understand the tradeoffs behind why you would pick a certain data structure to solve a particular problem.
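As a starting point for the exercises above, here is a minimal singly linked list in Java with the usual find/insert/delete operations; the big-O comments tie back to the tradeoff discussion (e.g. linear search here vs. constant-time average lookup in a hash table).

```java
public class SinglyLinkedList {
    private static class Node {
        int value;
        Node next;
        Node(int value) { this.value = value; }
    }

    private Node head;

    // Insert at the head: O(1).
    public void insert(int value) {
        Node n = new Node(value);
        n.next = head;
        head = n;
    }

    // Linear search: O(n) - contrast with O(1) average for a hash table.
    public boolean find(int value) {
        for (Node cur = head; cur != null; cur = cur.next)
            if (cur.value == value) return true;
        return false;
    }

    // Delete the first match: O(n); returns false if the value is absent.
    public boolean delete(int value) {
        Node prev = null;
        for (Node cur = head; cur != null; prev = cur, cur = cur.next) {
            if (cur.value == value) {
                if (prev == null) head = cur.next;
                else prev.next = cur.next;
                return true;
            }
        }
        return false;
    }
}
```

Once this feels natural, a doubly linked version (add a prev pointer) and a queue/stack built on top of it are short follow-up exercises.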
2) Get some exposure to searching and sorting algorithms and understand the various trade-offs - maybe start with linear and binary search, insertion sort, merge sort, and quicksort (the latter two will give you an idea of divide-and-conquer strategies, recursion, and so forth).
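Two of the algorithms just mentioned, sketched in Java: binary search (O(log n) on sorted data, vs. O(n) for a linear scan) and merge sort as the canonical divide-and-conquer example.

```java
import java.util.Arrays;

public class SearchSortDemo {
    // Binary search on a sorted array: O(log n). Returns the index, or -1.
    public static int binarySearch(int[] a, int key) {
        int lo = 0, hi = a.length - 1;
        while (lo <= hi) {
            int mid = (lo + hi) >>> 1;   // unsigned shift avoids overflow
            if (a[mid] < key) lo = mid + 1;
            else if (a[mid] > key) hi = mid - 1;
            else return mid;
        }
        return -1; // not found
    }

    // Merge sort: split, recurse, merge. O(n log n) time, O(n) extra space.
    public static int[] mergeSort(int[] a) {
        if (a.length <= 1) return a;
        int mid = a.length / 2;
        int[] left = mergeSort(Arrays.copyOfRange(a, 0, mid));
        int[] right = mergeSort(Arrays.copyOfRange(a, mid, a.length));
        int[] out = new int[a.length];
        int i = 0, j = 0, k = 0;
        while (i < left.length && j < right.length)
            out[k++] = (left[i] <= right[j]) ? left[i++] : right[j++];
        while (i < left.length) out[k++] = left[i++];
        while (j < right.length) out[k++] = right[j++];
        return out;
    }
}
```

Tracing the recursion by hand on a five-element array is a good way to internalize the divide-and-conquer pattern before moving on to quicksort.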
Most advanced data structures and algorithms build on the aforesaid foundations, so with (1) and (2) alone you will have a strong base; depending on your interest, you can then dig further into related areas - e.g. graphs, linear algebra problems, etc.
3) Get exposure to OS/system-level programming if possible - e.g. on Linux you can get some idea of processes vs. threads, schedulers, memory management, file and network IO, etc. At the very least, writing some toy code here - say, opening files, forking a process, doing socket IO, and so forth - will give you some practical exposure.
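The "toy code" suggestion above might look like this in Java (a sketch, not POSIX itself: Java has no fork(), and process spawning would use ProcessBuilder instead): a file write/read round trip and a one-shot TCP echo server talking to a client over the loopback interface.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.ServerSocket;
import java.net.Socket;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

public class ToySystemDemo {
    // File IO: write then read back a temp file (the analogue of open/write/read).
    public static String fileRoundTrip(String text) throws IOException {
        Path tmp = Files.createTempFile("toy", ".txt");
        Files.writeString(tmp, text);        // Java 11+
        String back = Files.readString(tmp);
        Files.delete(tmp);
        return back;
    }

    // Socket IO: a one-shot echo server on an ephemeral port, plus a client.
    public static String echoRoundTrip(String msg) throws Exception {
        try (ServerSocket server = new ServerSocket(0)) {   // port 0 = any free port
            Thread t = new Thread(() -> {
                try (Socket s = server.accept();
                     BufferedReader in = new BufferedReader(
                         new InputStreamReader(s.getInputStream(), StandardCharsets.UTF_8));
                     PrintWriter out = new PrintWriter(s.getOutputStream(), true)) {
                    out.println(in.readLine());             // echo one line back
                } catch (IOException ignored) { }
            });
            t.start();
            try (Socket client = new Socket("127.0.0.1", server.getLocalPort());
                 PrintWriter out = new PrintWriter(client.getOutputStream(), true);
                 BufferedReader in = new BufferedReader(
                     new InputStreamReader(client.getInputStream(), StandardCharsets.UTF_8))) {
                out.println(msg);
                String reply = in.readLine();
                t.join();
                return reply;
            }
        }
    }
}
```

Rewriting the same two exercises in C against open()/read()/write() and socket()/bind()/accept() is a natural next step for seeing what the JDK is wrapping.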
4) HTTP/web services: you already mentioned meteor.js, so it sounds like you are already familiar with this area, especially with tools like Firebug for inspecting raw HTTP payloads and so forth.
5) Databases - SQL, and maybe some NoSQL use-cases.
6) OOP and Design Patterns
As you get a grip on the aforesaid areas, the more code you read, the more practical use-cases you will pick up. For some of the topics, you can get familiar with the area using any of your favorite scripting languages (including POSIX functions for point 3).
Give yourself ~6+ months and get as much exposure to practical use-cases as you can.
For points 1 and 2, coursera.org has a great set of courses on algorithms[1], taught by a Princeton professor. I started the first one a couple of months ago but had to drop out because of time constraints; I've re-enrolled for the upcoming session. It has a companion book, Algorithms[2], which I bought and which was a great help. I don't have a copy of CLRS[3], but I've heard it is also a great reference, and it is on my Amazon wishlist. Other than that, MIT's OpenCourseWare[4] also has the entire CS degree online for free.

It's funny - reading your post, I thought I had sleep-posted or something. You seem to be experiencing exactly what I'm going through. I'm also 27 and don't have a CS degree. I am highly motivated, though, and have no problem hacking at this stuff on my own. I've talked to a bunch of CS majors who still feel inferior after they graduate, so other than helping with the impostor syndrome[5] that I suffer from, I'm not sure a degree would do me a lot of good (with the exception of padding my resume and getting me in the door for an interview).
[1] https://www.coursera.org/course/algs4partI
[2] http://amzn.com/032157351X
[3] http://amzn.com/0262033844
[4] http://ocw.mit.edu/index.htm
[5] http://en.wikipedia.org/wiki/Impostor_syndrome
True, but those are not always possible, nor can they cover everything.
What struck me most was the criminal incompetence of the developers - both of the overall system as it moved to software control and especially of the software itself. Not to mention the Crown Corporation's response to the problem.
which eventually uses Pu-239 from the fast-breeder reactors (2nd stage) as the neutron source to take it to the 3rd stage.
IIRC, the original plan projected reaching the 3rd stage in the '90s, but due to various sanctions it is still stuck in the first stage; it will probably speed up now thanks to the Indo-US nuclear deal from the 2nd term of the Bush administration.
Big fan of the kqueue() mentioned in the article. IIRC, with sockets it not only tells you that the socket fd is ready (say, for a non-blocking read), but also reports the number of bytes available to read, which lets you write efficient code (i.e. size your read exactly, rather than reading into some fixed buffer and looping again to see if more data needs to be read).
Also, I think for files/directories you can listen for any changes that occur.
Note: I tend to use frameworks largely to enforce safe DB queries. IIRC, Kohana does all the needed escaping transparently. It might seem like overkill, but it is better than relying on the discipline of individual developers.
Kohana is structured in about the same way, but is more complicated and ships with some basic modules, whereas Kamele does not.
Kamele does have a nice Database class that handles the PDO object in a MySQLi-like way of approaching the object. The Database::safeQuery() function will be especially interesting for you: https://github.com/goldenice/Kamele-Framework/blob/master/sy...
Nothing unusual about the lock-in strategy. I would argue Apple does the same with the iTunes/App Store walled garden, and likewise Facebook and Google - the Android Play Store, and Gmail/Google Docs especially for small businesses.
It has been a dream for many companies to own the full stack - from the client all the way to the server. MS tried with ActiveX controls in their browser, which only worked well with IIS servers. Apple is pretty much doing the same with its offerings.
It is a must-read, especially if you are inclined towards cultural determinism. IMO, he argues persuasively against all that and provides good counterpoints on how people can change over time.