Let me quote the passage from Robert Townsend's "Up the Organization" that DeMarco and Lister quote in turn in their "Peopleware" (needless to say, they quote it to say it's pure crap):
"If you've inherited (or built) an office that needs a real
house cleaning, the only sure cure is move the whole
thing out of town, leaving the dead wood behind. One of
my friends has done it four times with different companies. The results are always the same: 1) The good
ones are confident of their futures and go with you. 2)
The people with dubious futures (and their wives)
don't have to face the fact that they've been fired. "The
company left town," they say. They get job offers
quickly, usually from your competitors who think
they're conducting a raid. 3) The new people at Destiny City are better than the ones you left behind and
they're infused with enthusiasm because they've been
exposed only to your best people."
— Up the Organization
I think "RTO to make people leave" has the same idiotic thinking at its base.
The article's top comment, by Jeff Cunningham, is definitely worthwhile:
I have a web server that has been running since 2008. It started out as a vanity website, written back in the days when a large number of websites belonged to individuals, before the monetization of the web. As that process developed, privacy issues began to rear up and my site went through a series of contractions, to the point where I almost shut it down. But it remained useful to me for several reasons. First, I had written a number of web applications that were very useful to me, personally. Writing a computer application for particular platforms takes a lot of work, and it must be constantly monitored for compatibility with continually evolving operating systems. Web applications put that burden on browsers, which provide an application programming interface that works (pretty much) across various platforms. My applications work on my native Linux machines, my wife’s Mac, our phones and tablets, etc. They enable me to interact with my own records, resources, and references, stream my own music, and transfer large files to and from wherever I am in the world – without Google, Amazon, Facebook, or any other corporate or government entity looking over my shoulder.
As security became an issue, I changed the site to require authorization to access most of it; without authorization, the existence of most of it became invisible. But I left a small number of publicly accessible pages on the site. I had a pretty decent weather station I’d built and had online since I started the site. And I published some code I’d written that a few people found interesting, which led to some email discussions (and one exchange wherein a Chinese student tried to get me to solve his take-home Lisp programming final exam problem for him).
I regularly monitored my server logs – a record of the request traffic it receives. As the monetization of search took off, my server traffic exploded. A large amount of it was the big search engines – both foreign and domestic. It got to be ridiculous. In any given period, only a very small amount of the traffic was from “real people” (me, my family and friends, and an occasional stranger steered there mysteriously by search); all the rest was search engines scraping the site – which changed rarely. There are methods (robots.txt, chiefly) that are supposed to control them somewhat, and the big commercial domestic engines seem to obey them. Most of the foreign ones just ignore them.
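For reference, the opt-out convention being alluded to is robots.txt, which well-behaved crawlers are supposed to check before fetching anything. A minimal sketch of what a compliant crawler does, using Python's standard library (the URLs here are placeholders, not the commenter's site):

    from urllib.robotparser import RobotFileParser

    # A compliant crawler fetches /robots.txt and checks each URL
    # against it before crawling; non-compliant ones simply skip this.
    rp = RobotFileParser()
    rp.set_url("https://example.com/robots.txt")  # placeholder site
    rp.read()
    print(rp.can_fetch("Googlebot", "https://example.com/private/"))

Nothing enforces this; it relies entirely on the crawler choosing to honor it, which is exactly the asymmetry the comment describes.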
But the biggest growth in traffic I saw from about the mid-twenty-teens was from hackers and from commercial operations looking for ways to exploit my site or data and sell it. I spent a lot of time learning how to track and classify these operations and the hackers. The emergence of geolocation techniques (which use multiple worldwide servers to triangulate the actual latitude/longitude of IP sources based on transit delays) helped tremendously in this endeavor. China, Russia, the UK, and, curiously, locations around Washington, D.C., turn out to be the single largest sources of attacks on my U.S.-located site. But there are waves of attack origins that temporarily roll through (lately, Ukraine, Hanoi, Tehran, Sweden, and Hamburg, Germany have been prominent).
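A hedged sketch of the delay-based idea: light in fiber travels at roughly two-thirds of c, about 200 km per millisecond, so each probe's round-trip time puts an upper bound on how far away a source can be, and intersecting those bounds from several worldwide probes narrows its location. The numbers and names below are illustrative, not any particular geolocation service's method:

    # Rough upper bound on source distance from a probe, given RTT.
    # Light in fiber travels at ~2/3 c, i.e. about 200 km per ms.
    SPEED_IN_FIBER_KM_PER_MS = 200.0

    def max_distance_km(rtt_ms: float) -> float:
        """One-way delay times propagation speed bounds the distance."""
        return (rtt_ms / 2) * SPEED_IN_FIBER_KM_PER_MS

    # e.g. a 40 ms round trip caps the source at ~4000 km from the
    # probe; intersecting such disks from many probes triangulates it.
    print(max_distance_km(40.0))  # 4000.0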
How my comment ties in with this article is this: a while ago I relaxed my constraints on the big search engines. And what I discovered was that, while they came back around and sniffed at the site, they just moved on without bothering to index it. They simply don’t care about individual sites like mine anymore. They would if I were posting ads on it. Or cross-linking to sites that posted ads. Or selling something. Or buying something. But just putting up information about various topics without any of that? Sorry, not interesting anymore. I am a non-entity to Google (in more ways than one). They just are not interested in the content anymore – not if it will not be useful for generating clicks for their advertisers.
And I realized that this is what I’ve been noticing with search for quite some time. It is very difficult to find non-monetized websites with search. There was a time when you could if you dove deep – meaning you kept going through page after page of links. Eventually you’d get past the heavy advertising and find a few real, interesting topical pages. Now, after several pages, the search engines simply say you’ve reached the end of their results. The end of the Internet! It used to be a joke. Now it’s a reality. At a time when there have never been more websites online, the web has never been shallower.
And even this is selling Singapore’s policy and procedure short.
On top of the broader measures listed and the general level of public cleaning and maintenance, every diagnosed dengue infection is reported to the National Environment Agency along with the individual’s home address and particulars. Cases and trends are monitored for developing clusters, with up-to-date findings and exact numbers published publicly. Where a cluster is found, they do additional anti-mosquito fogging and significantly step up the local public awareness campaign (huge banners, posters in every elevator, leaflet distribution, etc.).
They then send agents, unannounced, to inspect inside every home/unit in the area for potential breeding grounds. Everything is checked, from potted-plant trays to dish-drying racks to toilet-bowl scrubber holders. If any breeding is found, there are fines in the thousands of dollars. The agents are empowered to enter without a warrant; it’s taken THAT seriously.
The NEA also monitors and takes appropriate action against non-residential areas like construction sites, where standing water is hard to eliminate unless it’s a priority.
I’ve been working on a synthetic DNA assembly company. Basically, I figured out how to assemble DNA for people at a fraction of what it normally costs, so they give me a sequence, and then I make it in real life for them, then ship it to them.
Most of my customers have been AI protein designers, ironically. Turns out SOMEBODY has to wrangle atoms in the real biological world and that’s me!
After almost a year of work I finally smoothed out all the kinks in the process, so I can now go from a design to synthetic DNA in a cell in about a week (not counting oligo pool synthesis time). I can do about 600,000 bp per week, which is enough to synthesize the smallest bacterial genome each week, though I only work in roughly 1,000 bp fragments. I’m also completely bootstrapped and self-funded, and my only help comes from my several Opentrons robots.
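Taking the comment's own figures at face value, the weekly throughput works out as below; this is a back-of-the-envelope check on the quoted numbers, not independent data:

    # Figures quoted from the comment above, not measurements of mine.
    weekly_bp = 600_000   # assembled base pairs per week
    fragment_bp = 1_000   # typical fragment size worked in

    fragments_per_week = weekly_bp // fragment_bp
    print(fragments_per_week)  # 600 one-kilobase fragments per week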
For a low-math text with aeronautical engineering at its core, the FAA's Pilot's Handbook of Aeronautical Knowledge [1][2] is hard to beat. It won't talk much about jet engines, though.
Real Engineering [3] and Mustard [4] tend to do a decent job surveying specific plane designs. And the Wikipedia pages are decently peppered with references.
Otherwise, these references get into the weeds on engine design [5] and launch mechanics [6].
I was an early Tableau adopter and promoter in my org many years back.
At the time the incumbent, Qliksense, was lagging badly, and there was genuine grassroots support for Tableau as our next-gen BI tool. Adoption grew significantly and people were happy... for a while. I remember reaching out to the founder/CEO at the time and getting responses back. It was great.
Fast forward to today and it's different. Over time, Tableau dissatisfaction grew; I think a lot of it was that our use cases got more advanced and performance dropped, and not being able to understand the complex SQL it was generating didn't help. Also, Qliksense gained a noticeably better UX and, importantly, was cheaper. Today, MS Power BI, which came from left field, is really the front-runner for us.
Parking this situation for a second, I think the real issue is that people are getting a bit overwhelmed with dashboards. I have access to nearly 100 across the three platforms; they more or less do the same thing and are thus somewhat commoditized, and I don't feel passionate about any of them as I did back in the day with Tableau. Worse, I have to spend mental bandwidth figuring out which one I need to open to answer my query, then spend time navigating the UX to get my answer so I can move on with the task at hand.
We don't necessarily need more dashboards, just faster ways to go from question to reliable answer, where information can be pushed rather than pulled, especially if data points start looking anomalous over time.
Note - If you're a startup in this field looking to upend the BI space, reach out. I have spent a lot of time in this space over the years.
Great read! This reminds me of a macOS app I made for my wife a few years back. It keeps track of the opening hours of all her favorite shops, and she can click a menu bar icon to see how long until each one closes today. It also warns if it's currently peak/rush hour for the shop, since she prefers to go when it's less crowded.
It's a simple Qt app that uses a text file for data storage. I wrote it after noticing that she had trouble remembering which shops are open when. I asked her what to call it, and she said "Gladiolus, like the flower" so I named it Gladiolus.
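The core lookup behind such an app is tiny. A hypothetical Python sketch of the "how long until close" calculation (the function name and the hard-coded closing time are invented for illustration; the actual app is Qt and reads its hours from a text file):

    from datetime import datetime, time

    def minutes_until_close(closing: time) -> int:
        """Minutes from now until today's closing time (0 if closed)."""
        now = datetime.now()
        close_dt = now.replace(hour=closing.hour, minute=closing.minute,
                               second=0, microsecond=0)
        return max(0, int((close_dt - now).total_seconds() // 60))

    print(minutes_until_close(time(18, 0)))  # shop closing at 18:00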
I can say for sure I've never had a more appreciative client as a programmer than the one user of Gladiolus :^)
In the late 90's, my dad was a screen printer who did a bunch of big garment runs for Walmart stores. He'd built a good-sized business, doing regular jobs for Disney, action sports, the NFL, etc., running a shop that usually had two shifts of hundreds of employees. It's not an exaggeration to suggest that if it had a screen-printed image on it, we probably printed it.
> very exacting in pick up and delivery times
Well, for us, not exactly. In our last contract with Walmart, there was a lot of language about timing. If the driver arrived and the order wasn't ready for pickup, we'd be penalized for every hour of delay. The penalties were such that you could end up losing crazy money on the job if you weren't confident in your ability to deliver on time.
Not a big deal for us since we were tight, and could easily handle it.
Then Walmart arrived a day early. The driver had chosen to stop at our shop first, and after pulling up the contract we realized it allowed for that. Our lawyer had determined the contract was vague enough that the agreed delivery date would prevail, contractually, which is probably why we didn't notice that it also said the pickup clock started the minute the driver arrived, and that his arrival counted as the official pickup time. Had we caught that during contract negotiations, we'd have refused the terms. But we missed it.
We weren't a small shop. We already ran two shifts, and quickly needed to pull in a third to be able to finish in time to get any profit out of the job.
We ended up doing that last contract with almost zero profit, and refused all business from Walmart after that day.
Thing is, I've heard similar stories of Walmart pulling the same tricks with produce from farms. Myth or not, I don't know, but I know with certainty how they treated us. There were probably incentive structures in place for drivers to show up early, and some of them had the balls to show up a lot earlier than others. In this case, Walmart could give the driver a bonus for his better-than-on-time averages, then "buy" goods worth $5M+ retail for the cost of the driver's salary and bonus, plus whatever the supplier (us) managed to keep after penalties.
> Electrochemical, magnetohydrodynamic and various other systems are not properly characterized by the Carnot limit and can approach much higher efficiency limits.
That is usually a distinction without a difference. Other effects limit the efficiency at levels far below the pure Carnot bound.
This can be understood intuitively like this: the Carnot limit applies to gases, which are the simplest interacting systems. You can realistically model them as just a collection of individual, independent particles interacting only via simple collisions.
With anything more complicated, like electrons in semiconductors, you have many more interactions and many more ways for your system to be inefficient.
Take, for example, solar panels. Sunlight has an optical temperature of around 6000 K, so a Carnot engine running between a 6000 K hot side and a 300 K cold side theoretically has 95% efficiency. Yet an infinite stack of solar panels, each layer tuned to a specific wavelength, has a theoretical maximum of only 87% efficiency.
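For the record, the 95% figure is just the textbook Carnot formula evaluated at those two temperatures:

    \[
      \eta_{\mathrm{Carnot}} = 1 - \frac{T_c}{T_h}
                             = 1 - \frac{300\ \mathrm{K}}{6000\ \mathrm{K}}
                             = 0.95
    \]

The gap down to the ~87% figure for the idealized multi-layer panel stack is exactly the kind of "other effects" penalty the comment is describing.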
These requirements are pretty tough to meet. The above answer about using KeePassXC on desktop, and syncing the db to your phone is probably the best solution, as it meets every requirement except not switching password managers.
If you like Bitwarden, it will do what you want if you pay for their premium account. If you don't want to do that, you can host your own Bitwarden server (I think that implementation does 2FA, but I'm not positive).
Another LaTeX-to-HTML tool is lwarp (https://github.com/bdtc/lwarp), which starts from the idea that there exists only one program that can parse LaTeX: the LaTeX compiler itself. Implementing a new parser is almost futile. So instead, the lwarp package redefines all the macros to output HTML. Something like
\renewcommand{\textbf}[1]{<b>#1</b>}
This way, compiling LaTeX gives you a PDF whose text is HTML code, so now you can extract the plain text from it and you have an HTML file. The advantage is that it can easily deal with custom macros etc., because these are natively resolved by the LaTeX compiler.
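A minimal sketch of what using it looks like, per my reading of the lwarp documentation (untested here):

    \documentclass{article}
    \usepackage{lwarp}  % for the HTML build, redefines standard macros
                        % so they emit HTML tags instead of typesetting
    \begin{document}
    Hello, \textbf{world}!  % \textbf ends up wrapping its argument
                            % in <b>...</b> in the HTML output
    \end{document}

The package ships a helper utility, lwarpmk, that drives the two builds (e.g. lwarpmk html for the HTML version, lwarpmk print for the ordinary PDF).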
I use lwarp to make https://tikz.dev/, an HTML version of the TikZ manual, which is probably one of the most complicated LaTeX documents in existence.
"If you've inherited (or built) an office that needs a real house cleaning, the only sure cure is move the whole thing out of town, leaving the dead wood behind. One of my friends has done it four times with different companies. The results are always the same: 1) The good ones are confident of their futures and go with you. 2) The people with dubious futures (and their wives) don't have to face the fact that they've been fired. "The company left town," they say. They get job offers quickly, usually from your competitors who think they're conducting a raid. 3) The new people at Destiny City are better than the ones you left behind and they're infused with enthusiasm because they've been exposed only to your best people." — Up the Organization
I think "RTO to make people leave" has the same idiotic thinking at its base.