
It must also be implemented in toilet paper packs. The last roll should be a different color.


I have seen some brands that do essentially this, e.g.: https://m.media-amazon.com/images/I/61wjeonOCdL._AC_SL1500_....

Personally I just keep a few rolls of single-ply Scott on hand for backup if my stock gets too low.


When I worked in retail the till roll (i.e. for the cash register to print receipts) had a red streak on the last metre or so.


Kleenex tissues have this feature where the last several tissues are of a different color.


Torso


That's still not specific enough. Any leading part of the torso, centroid, or average of the leading edge of it?


The rule is that the winner is the first person whose torso, defined as the trunk of the body, reaches the closest edge of the finish line. In practical terms that means the shoulder line when leaning forward, as is evident from the image in the article.


If you want to generate beautifully paginated PDF / printable pages from your HTML, I cannot recommend https://www.pagedjs.org/ enough. It supports page numbers, TOCs, footnotes, dynamic references ("see table on page x" links), and a lot of other very cool stuff.
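For the curious, a minimal Paged.js setup looks roughly like this (illustrative only; check pagedjs.org for current usage, and note the margin-box rules come from the CSS Paged Media specs):

```html
<!DOCTYPE html>
<html>
<head>
  <!-- The polyfill splits the body into printable pages on load -->
  <script src="https://unpkg.com/pagedjs/dist/paged.polyfill.js"></script>
  <style>
    @page {
      size: A4;
      margin: 20mm;
      @bottom-center { content: counter(page); } /* page numbers */
    }
  </style>
</head>
<body>
  <h1>My document</h1>
  <p>Content flows into paginated, print-ready pages.</p>
</body>
</html>
```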


Thanks!! This looks super interesting. I couldn't find any decent solutions except PrinceXML, but I will give this a whirl.

The fact that they are achieving it via a polyfill sounds like a genius approach!


I would love a sci-fi tv show exploring this idea.


Kind of like this movie?

http://www.imdb.com/title/tt1637688/


Visiting the first link (Dell 15.6-Inch Gaming Laptop) on Amazon, the price I see is $799.99. Visiting https://www.amazon.com/Dell-15-6-Inch-Quad-Core-i5-6300HQ-Pr... in private browsing, the price changes to $699. Is this standard practice for Amazon?


Yeah, this has been happening for a while: http://www.slate.com/articles/business/moneybox/2010/12/how_...


I wonder if this is just a coincidence of timing between your visits. I went to the link in a normal browser tab, and it was $699 the first time. Perhaps Amazon just happened to change the price between your visits.


Using private browsing is also recommended if you need to buy plane tickets.


The time of day is also an important factor in smart pricing.

Usually everything goes up when people are not asleep or at work.

It's as if prices at the mall were different when it is full.
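A toy model of that kind of time-of-day pricing (the hours and multipliers here are entirely made up for illustration):

```python
def smart_price(base, hour):
    """Return a time-of-day adjusted price; multipliers are invented."""
    if 18 <= hour <= 23:        # evening: most people awake and off work
        return round(base * 1.15, 2)
    if 0 <= hour <= 6:          # night: most people asleep
        return round(base * 0.95, 2)
    return base                 # working hours: baseline

print(smart_price(100.0, 20))  # 115.0
print(smart_price(100.0, 3))   # 95.0
```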


Same here. Interestingly the $799.99 non-incognito version stays the same price if you change the i5 to an i7. Possible bug?


The i5 and i7 having the same price makes sense, because the i5 model comes with an SSD vs. the i7, which comes with a (hybrid) HDD.


This is why I always use camelcamelcamel


Reminds me of [Why I Hate Frameworks] http://discuss.joelonsoftware.com/?joel.3.219431.12


Well, no. There is an upper limit on the damage a bad driver can do by, say, crashing his car into a bus or something like that. Imagine a bug or malware triggered at the same moment worldwide. It could kill millions. So it's not as simple as "it just has to be better than a human".


I've been itching to release this terror movie plot into the wild:

It's 2025 and more than 10% of the cars on the road in the US are self-driving. It's rush hour on a busy Friday afternoon in Washington, DC. Earlier that day, there'd been a handful of odd reports of self-driving Edsels (so as not to impugn an actual model) going haywire, and the NTSB has started its investigation.

But then, at 4:30pm, highway patrol units around the DC beltway notice three separate multi-Edsel phalanxes, drivers obviously trapped inside, each phalanx moving towards the Clara Barton Parkway, which enters DC from the west. Other units notice four more phalanxes, one comprising 20 Edsels, driving into DC from the east side, on Pennsylvania Avenue.

At this point, traffic helicopters see similar car clusters, more than two dozen, all over DC, all converging on a spot that looks to be between the Washington Monument and the White House.

We zoom in on the headquarters of the White House Secret Service. A woman is arguing vociferously that these cars have to be stopped before they get any closer to the White House. A colleague yells back that his wife is in one of those commandeered cars and she, like the rest of the "hackjacked" drivers and passengers, is innocent.


I'm afraid Daemon (novel) beat you to the punch. It's an excellent novel, about fairly similar situations.

http://www.goodreads.com/book/show/6665847-daemon


And a really fun read. If someone here decides to get this book, get the sequel as well - Daemon ends on kind of a cliff-hanger.


Freedom (TM) is the sequel, and the author (Daniel Suarez) has a few other near-term what-if-this-all-goes-skynet books which are equally good.


Thank you! I didn't know there was a sequel. I really enjoyed the first book.


One of my favourites; the sequels were similarly good.


Daemon is in that area but I'd for sure also enjoy the original proposal.


Cool plot line, I'd go see that movie.

A related scenario, one that theoretically could happen today, is hacking into commercial airliners' autopilot systems and directing dozens of flights onto a target.

Setting aside the fantasy movie plot angle, how realistic is this today? Is it any more or less plausible than the millions-of-cars scenario? If people are truly concerned about the car scenario, shouldn't they be worrying about the aircraft scenario?


I will disagree with the other commenter and say that this is more plausible for the aircraft than for the cars. Modern jetliners and military aircraft (scarier yet) are purely fly-by-wire - there aren't cables running between the yokes and the control surfaces like in a Piper Cub, and if there were, no pilot would be strong enough to move them.

Yes, the autopilots can be turned off, but that's just a button, probably a button on the autopilot itself. Depending where the infection happens, the actual position of the yoke could be entirely ignored by the software. Or the motor controllers for the control surfaces themselves could be driving the plane, though I don't know how they could coordinate their actions and get feedback from an IMU.

Perhaps the pilots could rip out components and cut cables fast enough to prevent the plane from reaching its destination, and maybe they could tear out the affected component and limp back to a runway with what remains, but it's an entirely feasible movie plot.

But should we actually worry about either? No. The software sourcing, deployment and updating protocols at the various manufacturers of aircraft are certain to be secure. Right?


In 2007 the FAA revealed the Boeing 787 had passenger Internet traffic and flight control traffic on the same network separated via software firewall.

This gives us the classic reassuring response from Boeing spokeswoman Lori Gunter:

"There are places where the networks are not touching, and there are places where they are," she said.

http://www.wired.com/2008/01/dreamliner-security/


"had"? So it's fixed now, did they rewire the whole network?


Oh, I don't know if they ever air-gapped them fully. The FAA made them do some changes, is all I know.


> Yes, the autopilots can be turned off, but that's just a button, probably a button on the autopilot itself.

Airplane components tend to have shitloads of fuses for each component; any trained pilot knows how to pull the fuse for the autopilot system (or, in an extreme case, ALL fuses, to kill the entire airplane).


In the airplane case, it's possible today: https://m.youtube.com/watch?v=CXv1j3GbgLk

And

https://m.youtube.com/watch?v=Uy3nXXZgqmg

TL;DR you simulate a bunch of other planes in close proximity and the auto-pilot freaks out and tries to avoid them. As the second talk explains, the pilots would definitely notice and switch autopilot off. This is why IMO it's very important to not take ultimate control away from humans in cars. I would personally never buy one of the Google (or any other) self-driving models with no controls. It already freaks me out that many cars are drive-by-wire (for the accelerator), and now even steer-by-wire: http://www.caranddriver.com/features/electric-feel-nissan-di... #noThankYouPlease


No current airliner will automatically change course in response to a traffic conflict. If TCAS [0] gives an advisory, the pilot takes manual control or reprograms the autopilot. Spoofing transponder returns wouldn't do much to the aircraft except annoy the pilots.

Another reason traffic spoofing wouldn't cause the aircraft to deviate is that airliners fly standard approaches and departures (STAR [1] and SID [2]) and heavy traffic away from the approach paths would definitely get noticed.

Even the fly-by-wire Airbus can be flown manually using differential thrust and/or pitch trim control.

The only time I've heard of an Airbus losing control of a damaged engine is when the electrical cable was physically severed. This was Qantas QF32 [3], after one engine exploded and damaged the cables to another engine.

To "take over" an aircraft with pilots in the cockpit would require compromising multiple systems.

[0] https://en.wikipedia.org/wiki/Traffic_collision_avoidance_sy...

[1] https://en.wikipedia.org/wiki/Standard_terminal_arrival_rout...

[2] https://en.wikipedia.org/wiki/Standard_instrument_departure_...

[3] https://en.wikipedia.org/wiki/Qantas_Flight_32


> I would personally never buy one of the Google (or any other) self-driving models with no controls.

Google cars have the Big Red Button, which shuts off the self-driving system and brings the car to a stop.

What more controls do you need?


When you are barreling down a highway at 65 miles per hour, turning off the car might not be the best solution.


When you are barreling down a highway at 65 miles per hour and are not paying attention (and you wouldn't be, because the car drives itself just fine), giving you controls is much more dangerous (for you and others around you) than not.

Urmson talks about it here: https://youtu.be/Uj-rK8V-rik?t=14m3s


If a fuel injection system were to fail via a fried component, or even a short, it would trip a fuse and fail safe by cutting fuel and shutting off the car. Throttle cables, however, have definitely become stuck in their sheathing in the WOT position. Happened to my dad on the highway in a 1992 Isuzu Rodeo.


> I would personally never buy one of the Google (or any other) self-driving models with no controls.

It won't matter if all the other cars on the road besides yours don't have controls.


A minor point, but the electronic accelerator control in autos is called "throttle-by-wire."


I've always seen that called EPC, for Electronic Pedal Control, but that is probably a VW-ism.

On the other hand, on an EFI car, having a mechanical throttle cable does not add much hack-safety, as the ECU always has some way to override a closed throttle (either disengaging the throttle pedal mechanically switches control of the throttle to an ECU-operated servo, or there is a completely separate throttle controlled by the ECU).


> hacking into commercial airliners auto-pilot systems, and directing dozens of flights onto a target.

I would imagine that any pilot would figure out what was going on, unless it was on an incredibly foggy day.


It's Hollywood, name one movie where the villain did not disable the manual override. That's villainy 101.


Sure, but rayval was talking about a scenario that could happen today.

Although looking at the other comments, I think I'm significantly underestimating just how much of modern airliners is dependent on software. The pilots might be able to see that they're heading for disaster, but may not be able to do anything about it.


I know for a fact that there are 3 separate computer systems from 3 separate manufacturers on each Boeing airplane. Auto-pilot always uses the consensus of the 3 machines. It's a pretty far-fetched scenario in real life so I thought we were talking fiction.


Former Boeing software engineer, worked on engineering simulators (where real hardware was in the loop):

There is an idea of triple channel autolanding, wherein the plane uses the consensus of the three autolanding systems. Should no consensus be available, then the pilot is advised that autolanding is not available.

Other than that, any sourcing from different manufacturers is happenstance. 737 avionics are sourced from a different vendor than 747/757/767/777. And different functions can come from different vendors, although vendor consolidation has cut down on that.

I'm not across what happened post 777, as I left Boeing in 1999.
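The triple-channel consensus idea can be sketched roughly like this (a toy illustration only, not actual avionics logic; the tolerance value and command values are invented):

```python
def autoland_consensus(a, b, c, tolerance=0.5):
    """Return the agreed command if at least two of the three
    channels agree within `tolerance`; return None to signal
    that autolanding is unavailable."""
    for x, y in ((a, b), (a, c), (b, c)):
        if abs(x - y) <= tolerance:
            return (x + y) / 2
    return None

# One channel producing a wild pitch command is outvoted:
print(autoland_consensus(2.0, 2.25, 9.9))  # 2.125
# No two channels agree: advise the pilot autoland is unavailable:
print(autoland_consensus(1.0, 5.0, 9.9))   # None
```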


This is a good movie plot, it just has one huge plot hole:

>It's rush hour on a busy Friday afternoon in Washington, DC.

>each phalanx moving towards the Clara Barton Parkway

>all converging

DC rush hour? Moving cars? Please. Independence Day made me suspend less disbelief.


Maybe it'd be waves of vehicles, the first few waves crashing through traffic to make way for the rest of the swarm... I'm surprised I hadn't heard of this doomsday scenario yet haha


You don't need self-driving cars for such a scenario to happen -- cars are increasingly drive-by-wire, and the driver assistance features being added to cars (automatic lane keeping, automatic braking, smart cruise control, etc.) mean computers are already capable of taking over cars.


I came to that realization while driving my Leaf.

You see, with an internal combustion engine, there are several ways that you can stall the engine, even if the computer is controlling it. As long as you can stop it from rotating, it will stall.

Now, take a Leaf. The engine can't physically stall. It is completely controlled by electronics – in contrast, even an ICE with an engine control unit will have some of it being driven mechanically (valves and driveshaft are all mechanical). This also causes cars to "creep" when you release the brakes, as the engine has to keep rotating. In the Leaf, the "creep" exists, but it is entirely simulated.

Similarly, the steering is also electric and controlled by algorithms (more assist in parking lot, less in the highway).

Braking is also software-controlled. The first ones had, as people called them, "grabby brakes" (it would use regenerative braking with a light force; if you pressed more, the brakes would suddenly "grab" the wheel). This was fixed in a software update.

Turning on and off is also a button. Can't yank the keys either.

So yeah, presumably, a Leaf could turn on, engage "drive" and start driving around, all with on-board software. It lacks sensors to do anything interesting, but the basic driving controls are there.

Good thing it cannot be updated over the air.


I can see it now - a modern-day Tiananmen Square as a lone figure stands in front of a long line of Edsels.


It's Will Smith, he's the president, and he just got finished dealing with one of his super-cute children acting up, but actually doing something noble (and getting in trouble at school for it). Upon seeing the swarm: "Aww helllll naw"


Autonomous machines gone haywire?

Gratuitous CGI?

Will Smith saying "Awww, hell naw"?

https://www.youtube.com/watch?v=L1UxZJ9owXY


Preventing the terror plot because the "do not drive over humans" goal overrides the "navigate to preselected target" goal?
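That kind of override is just a strict priority ordering over goals; a toy sketch (the goal names and state structure are made up):

```python
# Goals are checked in strict priority order; the first goal whose
# condition fires decides this tick's action, so the safety goal
# always pre-empts navigation.
GOALS = [
    ("do_not_drive_over_humans", lambda s: s["human_in_path"], "emergency_stop"),
    ("navigate_to_target",       lambda s: True,               "continue_route"),
]

def decide(state):
    for _name, condition, action in GOALS:
        if condition(state):
            return action

print(decide({"human_in_path": True}))   # emergency_stop
print(decide({"human_in_path": False}))  # continue_route
```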


My first thought actually was: how would the cars react to road spikes thrown by police? Especially on a highway with barriers on either side.

Of course, a competent writer would've thrown in a line about how these cars are on run-flats at some point...

Our only hope is for the scientists in the So-Secret-the-President-Doesn't-Even-Know Facility to come up with something so crazy it just might work.


An extremely selective and directional EMP cannon springs to mind as being sufficiently Hollywood


There are actually already physical barriers throughout the governmental parts of Washington DC to prevent this sort of thing. There are permanent walls, blocks, and poles along the edges of the roads (often well-integrated into the architecture), and raisable barriers built into the road surface at intersections. Ain't no cars running over our president!


Fun. Johnny555 is right of course, and it was the Jeep hacked on the Interstate last year that first inspired me. The airplane scenario may be more likely, but I liked the additive nature of so many cars, cars a bit like insects. (You could go further in this direction by conjuring up multiple drone swarms.)

As to movie points: of course Will Smith is the hero, and we'll handle DC rush hour stasis through special effects. ;-)


You're overthinking it. Hack into OnStar, brick every connected vehicle at hh:mm, and have some gunmen start shooting at hh:mm + 1min.


Sounds vaguely similar to Speed [1] (okay, just in the sense that there's a vehicle that won't stop).

[1] https://en.wikipedia.org/wiki/Speed_(1994_film)


I'm less worried about vehicle incidents on flat surfaces and more worried by anything happening on mountain roads with deadly drops on one side.



If a bug can kill millions then it's not "better than a human" though, right?


Car manufacturers conduct recalls all the time. There is the possibility that a million self-driving cars could be held hostage from a remote control tower simultaneously, leading to injury or death for millions. However, in practice, as soon as an issue is discovered, there will be the equivalent of recalls (remote updates) and things like this will be fixed. People who are uncomfortable with self-driving cars will always be able to drive manually or override the automated controls. At some point, technology will progress enough that the benefits will outweigh the risks and people will adopt.


Car manufacturers are some of the last people I trust to be doing software updates. The recent Takata airbag recall is an example of the ensuing fecal tornado from large recalls: http://blog.caranddriver.com/massive-takata-airbag-recall-ev...

In some cases, people are having to wait months to get new airbags because they just don't have them in stock. In the computer case, would you want to keep driving until they can get you scheduled for a software update? Remember that many cars can't update critical software OTA.


> Car manufacturers are some of the last people I trust to be doing software updates. The recent Takata airbag recall is an example of the ensuing fecal tornado from large recalls: http://blog.caranddriver.com/massive-takata-airbag-recall-ev...

> In some cases, people are having to wait months to get new airbags because they just don't have them in stock. In the computer case, would you want to keep driving until they can get you scheduled for a software update? Remember that many cars can't update critical software OTA.

So I assume you don't own a car and you avoid them at all costs? Otherwise your paranoia becomes hypocrisy. If you cannot trust the car company to deliver software updates, you can't trust them to write the software in the first place, and modern cars are full of safety-critical software.

I also don't know why you're equating a manufacturing capacity limitation with a software update limitation. It's not as if Toyota is going to have trouble shipping bits a million times vs. a thousand times once the software update is written.

I think we can also safely assume that self-driving cars will generally be updatable OTA. But yes, you could drive it to the dealer if needed, and worst case the dealer could send people on-site to do the update.


Allegedly, Honda is offering rental cars to customers who are concerned about their safety when there are no parts available to repair their vehicle.

I say allegedly, because my local Honda dealership told me to pound sand when I asked for a rental car for the day they needed to repair my CRV.


A friend of mine works for VW's engine computer division. Yes, those engine computers. After all I've heard of their development methods (or lack thereof), I'm surprised the engines even start more often than one time out of ten.


My VW Golf has a bug where the driver's side door will be completely unresponsive after starting the ignition, with all the lights on the door being off too. After 5 to 10 seconds it will become responsive, which is a bit annoying if you're trying to open the windows to clear the damp mist on them, as you can't....

Also, if during normal routine you run through all four electric windows to close them (so passenger, driver, passenger rear, driver rear) in that order, you hear the solenoids click in a COMPLETELY different order. I am not sure if it is prioritising the messages in some way but the order that the windows "click" is not the order I press the buttons.

Also, I can get the CD player to crash.

Such minor noticeable issues make me think about the quality of the more important bits somewhat.

The breakdown of Toyota's safety code was interesting, and frightening really.


Care to elaborate? Kernel developers are not practising Scrum or TDD, yet they are shipping a fairly stable product.


But they do know what a VCS is, and they don't re-invent lint because they want to ship broken code and need to only check for 2-3 minor issues while leaving the rest alone, as they need to rely on "magic" code exploiting undefined behaviour in certain hardware+compiler combinations.


People won't be able to override buggy software: they can't even do that now; just look at the remote Audi and BMW hacks that can brake the car on the highway.


Or the jeep: https://blog.kaspersky.com/blackhat-jeep-cherokee-hack-expla...

Auto mfgs seem to be about 20-30 years behind when it comes to computers. Not really surprising that Tesla is whomping them on this front, given how SV people are scrambling to work there. You don't see that with the Big 3 or really any other car mfg.


Depends on how unlikely that bug is :)

probability x value, etc.
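In other words, compare expected harm across the two failure modes (the numbers below are invented purely for illustration):

```python
# Expected harm = probability * damage. The point: a very rare but
# perfectly correlated bug can still dominate many small, independent
# human errors. All numbers are made up.
human_error = 1e-4 * 1        # common, but each incident harms few
correlated_bug = 1e-9 * 1e6   # rare, but hits millions at once
print(correlated_bug > human_error)  # True
```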


There is something called failing safe(ly). In case of any error inside the car, it should slowly decelerate and pull over. Sure, it would cause a lot of traffic problems if, say, 10% of all cars did that at the same time, but the damage would not be as severe as a ghost driver entering the freeway at 140mph.

There are a few instances where some bad Tesla batteries (the standard 12-volt batteries, ironically) failed, and the cars handled it perfectly. The car slowed down so that the driver could safely pull over. Sure, it did not happen to all cars at once, and autonomous cars might not be able to do that by themselves, but we have a long way to go to reach 100% autonomous driving (i.e. without a steering wheel, with a car that drives everywhere humans drive and not only on San Francisco's perfect sunny roads where it's been thoroughly tested).
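That fail-safe behaviour could be sketched as a tiny control loop (illustrative only; the mode names, speed and deceleration rate are invented):

```python
from enum import Enum

class Mode(Enum):
    DRIVING = "driving"
    FAILSAFE = "failsafe"   # decelerating toward the shoulder
    STOPPED = "stopped"

def tick(mode, speed_mph, fault_detected, decel_mph=5.0):
    """One control tick: on any internal fault, slowly decelerate
    and pull over rather than continuing at speed."""
    if mode is Mode.DRIVING and fault_detected:
        mode = Mode.FAILSAFE
    if mode is Mode.FAILSAFE:
        speed_mph = max(0.0, speed_mph - decel_mph)
        if speed_mph == 0.0:
            mode = Mode.STOPPED
    return mode, speed_mph

mode, speed = Mode.DRIVING, 65.0
while mode is not Mode.STOPPED:
    mode, speed = tick(mode, speed, fault_detected=True)
print(mode.value, speed)  # stopped 0.0
```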


Google had fail safes which failed. What if a million cars pull over to the left instead of the right as a failed failsafe? Willing to risk your life?


Yes, I'm eager to risk my life on self-driving car failures; it would be a tremendous step up from risking my life on human drivers (including myself!) as I do on a daily basis.

I'm also a biker. In 2013, 4,735 pedestrians and 743 bicyclists were killed in crashes with motor vehicles. http://www.pedbikeinfo.org/data/factsheet_crash.cfm

In the future when self-driving or at least augmented driving is commonplace, I hope that number will be a lot lower.


I think it still is that simple. Clearly, killing millions is not better than human control, so it would be unacceptable.


Sounds like current hardware defects and recalls for vehicles, with the difference being it's easier to fix.


Imagine a leap second bug causing vehicles all over the US to crash into NYE revellers.


I would rephrase your comment as "To pay taxes, there have to be some services first." But yeah, I totally agree with you.

