First, let me just say that this seems great. It looks like a perfect way to use reasonable defaults (project name, Version) and use the existing `dotnet publish` infrastructure to make containers. And I love how the blog post has both a simple CLI example, and a GitHub Actions yaml example! So thank you.
Now for the problem:
I still don't understand why other people compile dotnet projects in containers. Today, we have many containers built from a monolith, and it looks like if I make containers via `-p:PublishProfile=DefaultContainer`--for example, 20 containers--then that CI build is going to compile our codebase 20 separate times. With `-p:PublishProfile=DefaultContainer`, the long build is mostly duplicated in each container. Right?
So I have one major problem preventing me from adopting this: it's compiling in the container, which balloons our build time.
It's entirely possible I'm missing something obvious or misinterpreting the situation, and if so, please let me know. I'm mostly immune to feeling shame and appreciate feedback.
There is some benefit to building inside a container - it keeps your build environment consistent across team members and makes it easier to replicate your CI.
Having said that, because the .NET toolchain is capable of cross-targeting, this feature should enable broad swaths of users to get a container created without needing to build inside one. So I completely agree with your puzzlement here and would hope that this feature leads to a reduction in that particular pattern.
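For reference, a rough sketch of what that host-side invocation looks like, going from memory of the announcement (the Container* property names beyond PublishProfile are my recollection and may have changed in the shipped SDK, and `MyApi` is just a placeholder project):

```powershell
# Compile once on the host and produce an OCI image directly -- no Dockerfile,
# no SDK base image, and no build step running inside a container.
# Property names other than PublishProfile are from memory and may differ.
dotnet publish ./src/MyApi/MyApi.csproj `
    -c Release `
    --os linux --arch x64 `
    -p:PublishProfile=DefaultContainer `
    -p:ContainerImageName=myapi `
    -p:ContainerImageTag=1.0.0
```

As I understand it, the compile runs on the build host like any other publish; the container step just layers the publish output onto a base image.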
> it keeps your build environment consistent across team members
I have never had .NET build issues due to environment inconsistencies across team members. I think NuGet is pretty good at making the dependencies consistent. No need for containers.
I personally appreciate the ability to build on any machine - a newly set-up dev machine, or a new build machine - without having to worry about whether I have all the various dependencies installed for a successful build. Not all of my build dependencies can be handled with NuGet.
This is like opening a time capsule and examining all the little details. Almost more interesting than the barebones, locked-in-time product page for a paid C compiler, is the locked-in-time cgi-bin-powered shopping cart page at http://www.safepay.net/cgi-bin/shop/cart.cgi?db=products.dat... (note it's an http link).
There's a whole host of old-school relics here--it is truly a blast from the past: paying separately for a debugger, paying for specific libraries, videos available on DVD.
It's really hard to express how deeply internet access has changed the landscape of ... well, everything, but in this particular case, the programming ecosystem. gcc probably existed back when I bought this compiler, and probably so did Borland's excellent pre-internet-era IDE, but I didn't know that. And so I bought this one.
Having to pay for a compiler sounds a little funny through the lens of 2022; I wonder if the majority of people buying these software/packages at the time found the idea strange, too.
It was just the way things were back then. Compilers were expensive too. In the late 80s I saved up my allowance money for months to buy a Modula-2 compiler for my Atari ST. I still remember reading the manual on my way home from the city, floppies in hand.
In the 80s there was a vibrant shareware & public domain scene, but 'open source' wasn't nearly what it is today, and shareware & PD things were mostly utilities, etc. That really took off in the early 90s with the advent of Linux, the Internet, FTP sites. Some of the GNU stuff existed in the late 80s but was mainly only of use to academics until Linux came on the scene.
The upside of the way things were is that lots of people seemed to make somewhat reasonable livings as individual businesses selling software they'd made. People who would be sinking their time into open source projects now were often sinking their time into software that they sold by mail order or through user groups, etc.
The other thing I remember about 80's compilers is a lot of the vendors wanted to drink your milkshake too. They had licensing fees for every unit you shipped. Basically meaning they wanted a cut of your gross revenue.
To be fair, this particular compiler was relatively cheap. The number £30 springs to mind (which included the book), but I'm not sure if that was the price or just my faulty memory. The internet wasn't a thing for home users in 1989.
Only a couple of years later I had access to the internet and was downloading DJGPP. Fun fact is I now work with DJ.
Yeah, I noticed that, too. $20 for a compiler seems ridiculously cheap, even for 30-40 years ago prices. I would have thought it would be more like $200.
DJ is a well-known compiler developer at Red Hat who is also working on Arm and RISC-V support (https://www.delorie.com/users/dj/). I work at Red Hat on Arm & RISC-V (amongst a few other things).
Not at all. At the time, only people at universities and major tech companies had access to the net and what little free software existed.
The Borland and Microsoft compilers cost hundreds. Power C was a godsend to impoverished students, and as noted by others, the book alone was worth the price they were charging for the whole package. It was just outstanding.
While I did have access to the net, it was only over a 1200 bps dialup modem. Downloading something like gcc over that took...a while. The alternative was to bring a stack of floppy disks to school, and laboriously split anything big into chunks that would fit on a floppy.
I still remember when a friend of mine who worked for the university computer labs took the trouble of downloading all the floppies (> 20, IIRC) that let you work your way up to having a running version of this weird Finnish thing called "Linux" (the term "distro" didn't exist then). I got him to make me a set, and never looked back.
Even if you did have gcc, at that time it didn't support a lot of the stuff you needed to make professional-level MS-DOS software -- none of the graphics functions to build what passed for a decent UI at the time, no memory models (which sucked, big-time, but you needed to use and understand them), etc.
I still count Power C to be among the wisest purchases I ever made, along with a copy of K&R 1st edition. Between those two, you were golden.
This concludes this edition of crotchety grandfather talking about "In my day". :-)
I bought it at the time, somewhere in the 1990s. No internet and no cell phones at that time. Buying a box with floppies or a CD was normal, even for an OS or basic tooling. Open source was out of reach; I didn't even know the concept existed.
It came with a thick book, the reference manual for all C functions. That book alone was worth paying for.
Mix C was not that great; it miscompiled stuff on a regular basis. I debugged a problem for a day before finding out the compiler sometimes flat-out ignored basic constructs like i++.
One day I found out about djgpp, and even though it cost me as much as Mix C in phone charges, I never looked back. The quality was so much better. I still used that reference book for a long time, though.
> Having to pay for a compiler sounds a little funny through the lens of 2022; I wonder if the majority of people buying these software/packages at the time found the idea strange, too.
Nope. I have, right here on my desk somewhere[1], the CD-ROM for the Watcom C/C++ compiler that came with an IDE and the Watcom assembler wasm. IIRC, the IDE had a Vi mode and it came with a make that was much better than the nmake from Microsoft.
I remember buying it for a relatively large sum back in 1996 or so. I did not think it strange to pay money for a C and C++ compiler + assembler that allowed me to produce Windowed applications, device drivers and netware modules, that came with an IDE (with Vi-compatible bindings), as well as make.
There was tons and tons of documentation as well (Windows help files), more than I'd ever seen before in my life. It had enough documentation on that disk to take you from "never used C++ before" to "expert C++ developer". It assumed that Google and Stack Overflow did not exist, and so it answered any question you could possibly have had.
It also had samples for all the major things, so you could easily start a device driver project (for example) just by copying the samples.
Honestly, it seemed like great value for money to someone who had no internet.
> I wonder if the majority of people buying these software/packages at the time found the idea strange, too.
Not at all. Pretty much all software at the time was commercial or, at most, shareware. Pre-web, selling binaries to people to run on their computers was just how one made money as a software developer.
That professional tools in particular were fairly expensive software packages (CodeWarrior, a few years later, was several hundred dollars; Power C was dirt cheap at $20) seemed completely normal. A carpenter isn't handed a full workshop's worth of saws and chisels gratis, after all. If I wanted to be paid for making software, then obviously the professionals who made the compiler did, too.
(It's really difficult to convey how incredibly good all of the documentation and examples that came with some of these products were, too. Think C (back when Symantec sold compilers and wasn't a fourth-rate antivirus vendor) came with thousands of pages of physical manuals teaching you everything from the fundamentals of programming to exhaustively documenting their libraries, with wall charts of class hierarchies etc. Pre-internet, this stuff was worth its weight in gold.)
Paying for compilers was very normal. In the early 80's I paid for the Realia COBOL compiler for MSDOS and paid for a screen library, ScreenIO, that generated full-screen (80 chars x 25 lines) applications. I had already written a film scheduling app for a client on a minicomputer and they were getting charged for time on the mini. Buying the PC and paying me $5K + annual maintenance was going to be cheaper than paying for minicomputer access.
I had to buy a dev PC: a PC-AT 286 with a toggle switch on the back that allowed it to run at 6MHz or 8MHz; a 60MB Priam hard drive - one of the fastest available; and I think it had 2MB of RAM, though of course you could only use 640K with the rest usable as a RAM disk or disk cache. I think it was around $2500. For reference, IBM's PC-AT system at the time (1984) was $4K-6K with a 20MB drive.
The Realia compiler was something like $1200 and the screen library was around $450, I think. They're apparently still selling ScreenIO, though now for Windows.
Computer Associates bought the Realia company, abandoned the product, and focused on porting mainframe COBOL applications to the PC.
I didn't make much on the initial deal, but charged $500/year for maintenance and they ran the thing for over 15 years. My main goal was to get a fast PC: I had been reading Byte and PC Magazine for a few months and this was a way to get one without me plunking down a lot of cash.
I think it's kinda cool that a small company like Realia (out of Chicago) could create something like this and have a functioning company for many years with paid employees. Yeah, open source is great in some areas, but in some ways it is not so great. A few geniuses could not have a sustainable compiler business these days, no matter how great the software.
Turbo Pascal was a bargain at $49 when it was introduced, and remained a bargain as the delay loops were removed from the compiler (according to rumor), and features were added. Delphi at $200 was still a good deal... then the price shot through the roof.
Oh, and it had a REALLY good manual and online help.
> Delphi at $200 was still a good deal... then the price shot through the roof.
The cheapest non-upgrade version of Delphi was $100 (you can find it on archive.org) and stayed that way until Delphi 5 or so, I think. Later they made that version personal-use only and (I think) upped the price for the next bracket, and after that they made the personal version free - but only for Delphi 7. After that everything went crazy expensive, until Turbo Delphi, which IIRC had two versions: one free that allowed commercial use but didn't allow installing custom components, and a "cheap" (compared to the other versions) $500 version that allowed it - but it was based on the most unstable version of Delphi ever and CodeGear killed it anyway. In recent times they have the "Starter"/"Community" free edition, but the license requires that your income is only something like $5k / year at most, otherwise you must buy at least the next bracket, which is around $1700 or so.
Funny how when the entire programming tools industry was going towards more accessible tools, Borland/CodeGear/Embarcadero went the complete opposite direction.
That is what reduced it to its current niche, and made many of us suck it up and move to the less capable Microsoft tooling (MFC vs OWL/VCL, really?).
They decided to go after the enterprise market with enterprise prices, and with it killed the indie devs that were loyal to Borland and their great tooling.
It was also a key point that finally triggered Anders to accept the calls from ex-Borland colleagues working at Microsoft.
I don't think so. What I remember is that compilers and other developer tools were expensive and that was normal. Real "professional" tools but also beginner stuff like Visual Basic. Though as a teenager I remember pirating them.
Free software and open source did a lot to change that. GCC, for example. Linux becoming popular also helped. Then interpreted languages became popular - perl, python, etc. - all free and under permissive licenses. Java was free for personal use but with commercial licenses, IIRC? Even Microsoft started having "express" editions or compiler-only downloads without the IDE. I'd say by the 2000s compilers were no longer a cash cow.
I remember reading a blog post years ago (don't remember where or by whom, unfortunately) that claimed the compiler market was also eroded from the other end - many small-ish companies building compilers were acqui-hired by large companies trying to improve the performance of their RDBMS, so the compilers those companies produced often ended up as roadkill, or if lucky, were open-sourced (think OpenWatcom).
But maybe it just turned out the real money is in the tools - you can get all of Microsoft's compilers for free, but they still charge you big $$$ for Visual Studio, and lots of developers apparently are happy to pay that price. Intel still charge big bucks for their compiler, but I have no idea how widely used it actually is or what Intel's thinking is.
I seem to recall that the free compilers from MS have (or used to have) limitations in the license about commercial use. If you use it for a popular product they may want money from you.
I was around in that era and having to pay for assemblers, compilers, etc... didn't sound strange at all. What was also a reality was the difficulty of actually getting them if you were far away from the US (Spain in my case). The world was way bigger back then in so many ways.
Maybe not for the compiler, but you pay through the nose for almost everything else!
Look at how Microsoft is pushing everyone to the Cloud, including desktops. All their developer products are wholly focused on pushing your workloads to Azure.
Because the only way to make money selling software to devs in 2022 seems to be going back to the timesharing days, with phosphor terminals and X terminals now replaced by the browser and cloud shells.
Any old-timer can map those cloud prices onto the old mainframe workload accounting.
A while ago I watched a video on YouTube about a guy reviving an old RS/6000 workstation from IBM, running AIX, with the ultimate goal of running Doom.
When it got to compilers, the C/C++ compiler suite was priced at something like $4k… although supposedly (that is, according to IBM marketing) it beat gcc by over 4x (in the performance of the generated code, I guess).
This was very common. I remember asking my parents to buy me a C compiler for Christmas, back in the early 90's. (Lattice C on the Amiga. It later became SAS/C!)
I think it was at least half a dozen floppies. It also came with a huge set of documentation.
Around the mid 90's I bought the Dice C compiler for the Amiga. I think it came with a book(?) I was really happy with the price - it was on special at a marked-down price. I counted it as one of the best purchases I ever made. I was doing a PhD in Applied Mathematics at the time, nearly all in Fortran, but I was interested in C. It turned out that learning C was a great decision, as it led to me becoming a professional programmer.
I think the Amiga collapsed not too long after I bought the compiler. Maybe the declining sales of the Amiga are why the author decided to sell the compiler relatively cheaply. Matt Dillon wrote the compiler, and subsequently went on to create DragonFly BSD.
It was not strange. Working with niche compilers was normal and high prices were expected. Even as a student I shelled out what would have been pretty significant money at the time for Borland C++.
Back in 1992 I bought Turbo Pascal for Windows 1.5, followed by Turbo C++ for Windows 3.1, for around what would be 100 euros in today's money, by making use of their student prices, also available to highschool students.
I had to get the money from different kinds of gigs while in high school.
Still, it goes with your point, as the minimum wage would be around 300 euros when converted to today's money, if I get the proportions right.
However, that is because I wanted to have the real deal - boxes, manuals and so forth. At a street bazaar those floppies would have been more accessible, as it was still the days when piracy reigned and there wasn't any government agency checking on software licence compliance.
This raises a host of other questions. Does Mix Software still exist? Their product catalog does not appear to be updated in the last ~20 years (owing to a Y2K compliance statement prominent on the front page), so unless they have a side business not listed on their website, it seems unlikely they’re still in business. Surely nobody’s still buying their compilers or training materials on VHS.
If they’re gone, who’s still paying the bill for hosting, and why? Just for fun to preserve a time capsule? Did they prepay for hosting for 30 years, and the site will disappear one day?
Checking Texas's CPA system, it seems like the company is (formally) active. The domain is registered at GoDaddy and it points to an IP in DigitalOcean's space.
If I had to assume, they probably still have some limited sales, support and consulting (ex. schools with teachers used to old software? companies with unusual development environments?).
While looking for Mix Software, Inc., though, I found their official(?) YouTube channel with the introductions for their C [0] and C++ [1] courses uploaded 8 years ago. They definitely have that old 90's VHS feeling to them, complete with music, early 3D animations and everything.
It would seem that the MIX C compiler for CP/M 80 got preserved at some point, too! [2] I'd need to look into it, but this looks promising.
A reasonable, high-enough-DPI option I've been watching for a while is a 55" 8K TV. We're close to the 8K utopia!
- HDMI 2.1 can push 8K @ 60Hz over a single cable (though I am told the colors are not full range? https://twitter.com/MaratTanalin/status/1426726300585185284 - "Afaik, 8K@60Hz via HDMI 2.1 is only possible with either DSC compression or chroma subsampling — both are lossy." See the rough bandwidth math after this list.)
- A 3090 can push the pixels
- 8K TV prices have come down to ~$1500-$2000
- In theory, some combination of Game Mode and/or manually turning off all forms of picture post-processing can get you into good-to-good-enough input lag
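On that first point, a back-of-the-envelope check (my arithmetic, not from the linked thread): 7680 × 4320 × 60 Hz × 24 bit/px is roughly 47.8 Gbit/s of raw pixel data before blanking intervals or 10-bit color, while HDMI 2.1's 48 Gbit/s FRL link leaves only about 42.6 Gbit/s of usable payload. Uncompressed 8K60 at full 4:4:4 chroma simply doesn't fit, hence DSC or subsampling.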
If you want to wait for monitor manufacturers to ship actual 8K monitors, well, it's inevitable, but it'll be a while. But for the bold and stout of heart, for those willing to take risks, a new adventure awaits you: 8K.
I hope people don't actually believe PowerShell is NIH.
PowerShell had good reasons to exist in 2006, when it was released to the public. LINQ didn't exist yet, most of the "Windows scripting" took place via VBScript files and WMI, and dinosaurs roamed the earth.
So PowerShell was created to be a "scripting language for Windows".
PowerShell has a lot of great features specifically created for use in the shell, and some ok features (like easy COM integration) that were great in 2006. If you don't know what those are, I guess you could draw the conclusion that it's just another .NET language.
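For anyone who hasn't used it as a shell, a couple of small illustrations of what I mean (my own examples, not anything official):

```powershell
# The pipeline passes typed .NET objects rather than text, so there's no
# awk/cut-style parsing between stages.
Get-Process |
    Where-Object { $_.WorkingSet64 -gt 100MB } |
    Sort-Object WorkingSet64 -Descending |
    Select-Object -First 5 Name, Id, WorkingSet64

# COM integration, the kind of thing that mattered a lot in 2006:
# script an existing Windows automation object directly from the shell.
$shell = New-Object -ComObject Shell.Application
$shell.Windows() | Select-Object LocationName, LocationURL
```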
Here's a glimpse of what they have to say about the Atlantic article:
> Author credentials are fair game if you’re going to editorialize, so let’s check hers: an intern until 2010, moved down from global editor to staff writer after 10 months in the higher position, wrote about home design and architecture, and listed her most recent accomplishment on LinkedIn as, “Talk about beards on the radio.” Nothing makes me angrier than people who’ve never spent a day working in either IT or healthcare blasting out their entirely unqualified opinions in passing themselves off as authoritative.
Anyway, read the HISTalk article for an insider take.
I wouldn't. It's not a terrible language to work with on the job, but I don't think I've gotten anything out of it in the same way that I did from learning OCaml or Lisp.
Ditto, there's not a whole lot of innovation of ideas in MUMPS at all. It's just a random programming language that's good enough to get things done with and the database that's next to it is nicer than most and supports both NoSQL and SQL. Spend the time on learning Python, JavaScript and SQL I'd say :P
You are applying your YC bias to the word 'incubator'.
in·cu·ba·tor/ˈinkyəˌbātər/
Noun:
1. An enclosed apparatus providing a controlled environment for the care of premature babies.
2. An apparatus used to hatch eggs or grow microorganisms under controlled conditions.
Thank you. Outlook 2007 and 2010 have a) flags, b) user-taggable-color-coded categories, c) an extremely usable ToDo bar, and d) Xobni and a million other plugins that attempt to prioritize your incoming mail automatically.
I'm not saying Outlook works for me (it doesn't), but it's annoying that the author pretended Outlook doesn't exist, and essentially proposed his version of Outlook as a solution to the "problem".