Hacker News | mikeyk's comments

I took CS193P when it was first offered in 2007; it was one of my favorite classes at Stanford because it was so hands-on. At the time few people had iPhones, so everyone in the class got a free iPod Touch for development. My final project was a photo sharing app with a Polaroid shake-to-reveal mechanic that lightly influenced Instagram, which Kevin and I built a few years later!


I can't seem to find the 2007 webpage (maybe it was one of the wiki-based ones?) but the 2008 syllabus looked very hands-on: https://web.archive.org/web/20081208171743/http://stanford.e...

I never took 193P, but I always found 148 to be hands-on, and I made it very hands-on for the year I contributed: https://web.archive.org/web/20130522184434/https://graphics....

I regret that we put my subdivision assignment last and allowed students to skip one assignment. Most students skipped it, but those who did the work thought it was super cool to have their own subdivision tool for making smooth meshes.


If you were a student in 2025, would CS193P (which looks SwiftUI-heavy) still be the hands-on foundation for the next big tinkerer, or would it look more like building around the affordances of AI? (Or something else?)


God I love Hacker News.


You built a solid app!


Wow. Thank you for your service


Yeah. Instagram was lovely. It might be disheartening to see what it became, what it does to people's minds for profit, and the costs to society as a whole.


How did the course exist in 2007? The App Store and SDK were released in 2008.


Ah, you’re right, I was off by one! 2008 was the year.


Love the throwbacks.

I wasted a few minutes earlier today trying to find the original website for the Cocoa class that Tristan helped set up a few years before this one got started.


Beautiful tribute — you captured Peter perfectly.


Thanks Mike. Big hugs.


Best I’ve found are the ones at Homestate in LA: https://www.myhomestate.com/


IG co-founder here: users 1 and 2 were our first two attempts at creating users end to end when getting Instagram v0.1 hooked up to the backend we'd written. There was a bug so they were left in an incomplete state; post bugfix, 3 was my co-founder Kevin and 4 is me.


Two ghost users, left in an incomplete state by a bug in a previous version of the codebase... you basically created the Twins from The Matrix!


Hey Aaron, sorry to hear and thanks for posting. We're hiring at Instagram for roles in NYC, SF, and Menlo Park; if you or folks on your team want a direct line, feel free to email me and I'll connect you to the right folks on the team.

mike [at] instagram [dot] com.


Browsers are all over the place, unfortunately. It's part of why sRGB became the only reasonable color profile for Web use. I think we'll see wide color become common in apps before the Web.


All over the place in what way? Support for different color profiles? Actually handling color spaces at all? The fact that there's no consistency when it comes to untagged images? The mess which is plugins? The ability to specify CSS colors in a specific color space?


We built this in already! We don't have a "1x" or "2x" indicator, but the dual-lens camera is fully used in Instagram now and does the smart transition between 1x–2x optical and 2x+ digital zoom.


Oh no way! That's awesome, apologies for not knowing this and thanks!


They mentioned this in the post, but it was fairly well hidden; can't blame you for missing it.


I used the same approach as the WebKit image, so the same applies here, too (it's also why we only serve Display P3 photos to iOS clients with wide-color screens; most Android devices would treat them incorrectly).


Thanks for the confirmation!


Good to know--I didn't run it through my Pixel. Some devices will do a relative projection from Display P3 to sRGB, which means that it will look "relatively" like it would in Display P3 but projected onto the sRGB color space.

Edited to add: and some other ones are doing something even less fancy, which is just to ignore the color profile entirely and just assume sRGB and display it incorrectly, taking for example what would have been the maximum red point for Display-P3 and making it the maximum red point in sRGB.
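To make the two failure modes concrete, here's a rough sketch (not Instagram's code) of where Display P3's maximum red lands in linear sRGB, using the standard published D65 matrices; gamma handling is omitted since the channel values involved are 0.0 and 1.0, which encode to themselves:

```python
# Where does Display P3's maximum red land in linear sRGB?
# Matrix values are the standard published ones (D65 white point).

def mat_vec(m, v):
    """3x3 matrix times 3-vector."""
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

# Display P3 (linear) -> CIE XYZ
P3_TO_XYZ = [
    [0.48657, 0.26567, 0.19822],
    [0.22897, 0.69174, 0.07929],
    [0.00000, 0.04511, 1.04394],
]

# CIE XYZ -> linear sRGB
XYZ_TO_SRGB = [
    [ 3.2406, -1.5372, -0.4986],
    [-0.9689,  1.8758,  0.0415],
    [ 0.0557, -0.2040,  1.0570],
]

p3_red = [1.0, 0.0, 0.0]  # maximum red in Display P3
r, g, b = mat_vec(XYZ_TO_SRGB, mat_vec(P3_TO_XYZ, p3_red))

# r > 1 while g < 0 and b < 0: P3's red point lies outside the
# sRGB gamut, so a correct converter has to clip or gamut-map it,
# while "ignore the profile and assume sRGB" silently remaps it
# to sRGB's own, less saturated red.
print(r, g, b)
```

The out-of-range result is the whole story: a relative projection pulls that color back into gamut sensibly, while profile-ignoring devices just reinterpret the raw channel values.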


Since you brought up your Pixel: what is the point of adding something maybe 1% of your customers can see, instead of fixing that horrible, horrible compression that makes uploads from Android (80% worldwide market share) look like crap?

(Not an Android user, I just want to figure out how a company of your size prioritises between bugs and features.)


I can't speak for OP, but say this does affect 1% of users today: what percentage does it affect in 6 months, or a year? Not bad to be proactive.

And regarding android compression issues, although resources are always finite, I imagine in this case the android team is fairly separate, so they may very well be working on that compression issue while iOS is pushing forward into new terrain.


Instagram should hire some faster Android developers if that is the case, it's been an issue since 2012:

https://www.xda-developers.com/a-look-into-instagrams-compre...


> I imagine in this case the android team is fairly separate

This is likely it, right here. So many people forget that larger companies have different teams working on different things. I bet a lot of their "iOS people" that are working on this project have no clue how the Android app works, and Instagram likely has a separate team working on the compression issues.


"Working on" since 2012?


I don't use IG, so I wasn't aware of that problem or how long it had been around. That said, the general sentiment stands: one of their teams working on one thing doesn't show that they don't have another team working on something unrelated.


> (80% world wide market share)

Not of Instagram users. Not of app users.


Well, given how Instagram has treated its Android users, can you blame them?

I've seen a number of SV companies release ugly and buggy Android apps, then use their shrinking Android user base as proof that Android users don't care for their services.

To be honest, things could be worse. You could be a Tinder user on Windows Phone...


We started using OpenGL in 2011. Our CPU-based image filters used to take 4+ seconds with pixel-by-pixel manipulation, and now can render 30+ times per second.

If you have some sample images where the current image pipeline is going wrong let me know and we can look into improving.
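As a toy illustration of why the GPU move pays off (not Instagram's actual pipeline): a per-pixel loop versus one whole-frame operation for a simple brightness filter, with NumPy vectorization standing in for the parallel work a fragment shader does across every pixel at once.

```python
import numpy as np

def brighten_loop(img, gain):
    """Pixel-by-pixel, CPU-style: one multiply-and-clamp per channel per pixel."""
    out = np.empty_like(img)
    h, w, c = img.shape
    for y in range(h):
        for x in range(w):
            for ch in range(c):
                out[y, x, ch] = min(img[y, x, ch] * gain, 1.0)
    return out

def brighten_vec(img, gain):
    """The whole frame in one parallel-friendly operation — roughly
    what a one-line GLSL fragment shader expresses per pixel."""
    return np.clip(img * gain, 0.0, 1.0)

# Both produce the same frame; the loop just does it thousands of
# times slower, which is the 4-seconds-vs-30-fps gap in miniature.
img = np.random.default_rng(0).random((64, 64, 3)).astype(np.float32)
assert np.allclose(brighten_loop(img, 1.5), brighten_vec(img, 1.5))
```

The real win on the GPU is that every pixel's shader invocation runs concurrently, so filter cost stops scaling with the interpreter-loop overhead per pixel.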


I think that's back when Gotham died :/

http://randsinrepose.com/archives/rip-gotham/


I also went through that process with my app Picfx. Using OpenGL for filters is much quicker; the only downside I've found is being limited by the texture size. I did set up a way to process images in tiles, but ultimately decided to just limit images to the texture size. Great info on the colour space, I'm sure it will be useful.


Instead of fixing the horrible JPEG encoding, can you please add support for WebP? It's quite a bit smaller and well supported with polyfills, since it's just a single VP8 frame.


Polyfills in general are a really awful user experience.

They are typically pushed by people who use the latest Chrome, so they have an excuse not to care about other browsers.

Their performance and usability are almost invariably terrible.


You don't need a polyfill to deploy WebP. Chrome automatically sends image/webp in the Accept header for image requests, so at the CDN level you can implement logic to seamlessly swap in WebP images for the browsers that support it. Imgix does this, for instance.
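The negotiation itself is a few lines; here's a minimal sketch (a hypothetical helper, not Imgix's implementation) that inspects the request's Accept header and picks the variant:

```python
def pick_image_variant(accept_header, base_name):
    """Serve WebP only to clients that advertise support for it in the
    Accept header (Chrome sends 'image/webp'); everyone else gets the
    JPEG fallback."""
    # Drop quality parameters like ';q=0.8' and collect the bare types.
    accepted = {part.split(";")[0].strip() for part in accept_header.split(",")}
    if "image/webp" in accepted:
        return base_name + ".webp"
    return base_name + ".jpg"

# Chrome-style request vs. a browser without WebP support:
print(pick_image_variant("image/webp,image/apng,image/*,*/*;q=0.8", "photo"))
print(pick_image_variant("image/png,image/*;q=0.8,*/*;q=0.5", "photo"))
```

A real deployment would also send `Vary: Accept` so caches keep the WebP and JPEG responses for the same URL separate.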

