Use GitHub Actions to Make Your GitHub Profile Dynamic (bengreenberg.dev)
195 points by mooreds on April 10, 2023 | hide | past | favorite | 49 comments


I've seen something like this being used to show the last X commits in a readme and I hate to be _that_ guy, but do we really need scripts that spin up a VM and install python/ruby/whatever multiple times a day just to potentially update a readme that only a few people will see?

Doing this once a week, like the article mentions, should be good enough for most people. It's not _that_ critical information.
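
If you do want weekly, the schedule is just cron syntax in the workflow file. A minimal sketch (the workflow name and job body are placeholders, not from the article):

```yaml
# Hypothetical workflow: run the profile updater once a week
# (Mondays at 06:00 UTC) instead of several times a day.
name: update-profile
on:
  schedule:
    - cron: '0 6 * * 1'  # minute hour day-of-month month day-of-week
  workflow_dispatch: {}  # keep a manual trigger for testing
jobs:
  update:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      # ... run your update script and commit the result here ...
```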


I do think it's good practice to enable caching, such that your script doesn't hit RubyGems / pip / npm / etc every time it runs.

That way at least the automation activity stays entirely within the GitHub / Azure network.

It looks like you can do that for Ruby by adding this:

https://github.com/actions/cache/blob/master/examples.md#rub...

    - uses: ruby/setup-ruby@v1
      with:
        ruby-version: 3.1
        bundler-cache: true


If you want it to be very efficient, pick a language like Rust, Go, Nim, maybe even Crystal or Haskell, which can produce a (mostly) self-contained executable, without installing anything. Write the logic for updating your profile with it, all while honing your efficient coding skills.

The few people who look at your GitHub profile may be important people, like every future prospective employer. It may make sense to impress them a little.


Ruby pays the bills and is nice to work with.


I think Ruby is fine for the task.

But if anybody wanted to show off the ability to run these updates exceptionally quickly and economically, there are a number of ways.


insert cocaine joke


I would even have both repositories on GitHub and see about triggering a workflow in my username repository to run, so it only runs when it needs to.

EDIT

https://blog.marcnuri.com/triggering-github-actions-across-d...

Or you could use something like this: build your blog in CI, get the top X posts in its CI workflow, and send them as a JSON payload to the GitHub repository containing the README you want to update.
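
The cross-repo trigger in that post boils down to GitHub's repository_dispatch event. A rough sketch from the sending side, assuming a personal access token with repo scope stored as a secret (the username, event type, and secret name are all made-up placeholders):

```shell
# Fire a repository_dispatch event at the profile repo from another
# repo's CI job. YOURNAME, "blog-updated", and PROFILE_REPO_TOKEN are
# hypothetical.
curl -s -X POST \
  -H "Accept: application/vnd.github+json" \
  -H "Authorization: Bearer $PROFILE_REPO_TOKEN" \
  https://api.github.com/repos/YOURNAME/YOURNAME/dispatches \
  -d '{"event_type": "blog-updated"}'
```

The profile repo's workflow then subscribes with `on: repository_dispatch: types: [blog-updated]`, so it only runs when something actually changed.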


I'd guess a single Google search probably consumes more joules than this


In 2011 a search on Google cost roughly 0.0003 kWh[1] (Although I suspect it is more energy efficient now).

According to [2], a small VM uses around 7.5 Wh per 30 minutes (for a blended load of active/idle, if I understand it correctly), or 0.00025 kWh per minute. So in very rough numbers, with a lot of assumptions, if your process takes less than about 1:15 to install and run your program, you are probably comparable to a Google search. More than that and you are almost certainly less energy efficient.
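
Sanity-checking that arithmetic with a quick awk one-liner (7.5 Wh over 30 minutes works out to 0.00025 kWh per minute, giving a break-even of about 1.2 minutes):

```shell
# Back-of-envelope check of the numbers above, using awk for the
# floating point math.
awk 'BEGIN {
  search_kwh = 0.0003            # one Google search (2011 figure)
  vm_kwh_min = 7.5 / 30 / 1000   # 7.5 Wh per 30 min -> kWh per minute
  printf "VM: %.5f kWh/min, break-even: %.1f minutes\n",
         vm_kwh_min, search_kwh / vm_kwh_min
}'
```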

[1] https://www.nytimes.com/2011/09/09/technology/google-details...

[2] https://www.researchgate.net/figure/Mean-Energy-Consumption-...


I've been running my profile like this for nearly three years now - it works great! https://simonwillison.net/2020/Jul/10/self-updating-profile-...

https://github.com/simonw is my profile.

https://github.com/simonw/simonw/commits/main shows 1,490 commits updating my profile so far, almost all of them from the GitHub Actions automation.


Your cron schedule is currently 3 times an hour. So this script ran 78840 times in three years to generate 1490 commits. So less than two percent of scheduled actions triggered a commit.
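
The math, for anyone checking (a run every 20 minutes is 3 runs an hour):

```shell
# Three scheduled runs an hour for three years vs. 1,490 commits.
awk 'BEGIN {
  runs = 3 * 24 * 365 * 3   # runs/hour * hours/day * days/year * years
  commits = 1490
  printf "%d runs, %.1f%% produced a commit\n", runs, commits / runs * 100
}'
```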

Wonder if webhooks or event-driven execution might be more efficient.


Webhooks are tricky because I'm pulling content from three separate sources - my blog (a classic Django app), my TIL website (content that lives in a GitHub repository) and my project releases (gathered from the GitHub GraphQL API).

The releases one is particularly hard - I'd have to add hooks to all of my 194 repositories that use the GitHub releases feature, and remember to add the same hook to future repos as well.

I'm careful to use caching for the pip dependencies (to avoid re-installing them from PyPI every 20 minutes) and to only hit either GitHub APIs (where I figure they can handle traffic from their own Actions) or APIs that I myself control - it doesn't feel rude to me to hit my own blog's RSS feed three times an hour as part of this.
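
For reference, recent versions of the setup-python action can handle that pip caching for you; a sketch (version numbers here are just examples):

```yaml
- uses: actions/setup-python@v4
  with:
    python-version: '3.11'
    cache: 'pip'   # keeps the pip cache between scheduled runs
- run: pip install -r requirements.txt
```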


> The releases one is particularly hard - I'd have to add hooks to all of my 194 repositories that use the GitHub releases feature, and remember to add the same hook to future repos as well.

If you set it up as a GitHub App you'd be able to add it as an always-installed app to your personal org, then everything (including new repos) would send events?

Or at the very least it'd be a central place for all the events and config, and as you add a new project you'd just need to install it on that new repo.


TIL. Thank you.


Based on the current updates, it seems like once a day would keep your profile updated.


Your updates may be more prolific than many...


A much easier way to do this (or at least to get started) is with https://github.com/muesli/markscribe - you just write a Go template (readme.md.tpl) and run the Action on a cron, and it renders it to readme.md.

The article makes it seem like there's something special about .github/scripts - but using that is, if anything, a bad idea: it's not special now, but maybe it will be (at GitHub's discretion) in the future. There are two good points to take away: 1) you can have a profile readme; 2) you might like to update it via a cron Action. But don't put your script there, and all the waffle about TS and Ruby and whatever is beside the point.


Mine is a fish tank. https://github.com/Krutonium


Does it update regularly?


No, but no reason it couldn't ;P


Cool, was just wondering if you were regenerating it at regular intervals or something


Awesome :D


The only thing that's missing is a custom repeated background image like it's myspace


If only we could get a blink tag and some techno music playing full blast.


<embed src="children.midi" hidden="true" autostart="true" loop="true">

Now I need to go listen to some Robert Miles again.


You're forgetting a marquee for the latest updates


That's still under construction, and we'll need another marquee or some images to let everyone know about it.


I have a job that runs every night and updates the statistics in my profile.

* https://github.com/tedivm/

* https://github.com/tedivm/tedivm/blob/main/.github/workflows...


You can also make your profile readme executable scripts, since shell scripts can be more or less valid markdown. See mine: https://github.com/egosown
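
A toy sketch of the idea (everything here is made up): `#` comment lines render as markdown headings, and a quoted heredoc lets the shell skip over ordinary prose.

```shell
#!/bin/sh
# Hi, I'm example-user
: <<'PROSE'
The shell ignores everything inside this quoted heredoc, so it can
hold ordinary markdown paragraphs, links, and so on.
PROSE
echo "profile script executed"
```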


Very cool. Literate programming is always an interesting exercise. In my undergrad I developed a little Julia macro library for turning my code with LaTeX encoded comments into a LaTeX document with code blocks, since the relevant source code needed to be explained as a central part of my thesis.

P.S., nice profile. Just finished reading the Landstreicher translation of The Unique and Its Property last night.


Indeed, I think literate programming might see a resurgence with the help of LLM tooling. Both Copilot and ChatGPT already have some ability with it and in the former case it helps with the loss of autocomplete you typically get when doing literate programming.


The autocomplete problem is pretty easily solved by implementing custom language servers that just dispatch to two existing language servers depending on context. I was half way through doing that before realizing that the entire project was a massive yak-shave off of the thesis I needed to finish.

I do think that it should be considered best practice to leave the prompt that generated some functional code inline in a comment or so forth. It acts as a marker letting the reader know "this is what the code was intended to do". If I find some weird code on a bug hunt, I need to spend time triaging it between "weird code required because of weird intent", "weird code that works, but a simpler solution exists", and "weird code that contains the bug I'm looking for". Context helps, and if you're writing the context anyway, there's no reason to exclude it from the source.


nvim allows more than one language server to be attached to a buffer...


In the 1990s I would spend countless hours per week on my .plan file, waiting for the day it would be read


So cool! I had never heard of the 'finger' cmd or .plan files. It led me to discover John Carmack's .plan file (https://fabiensanglard.net/fd_proxy/doom3/pdfs/johnc-plan_19...).

Does anyone know of any other interesting ones?

Edit: I found this thread from 2021 (https://news.ycombinator.com/item?id=29248368)


Yeah, I just have a chart of potholes filled in Raleigh, updated via Python: https://github.com/apwheele. (So it doesn't update the readme.md itself, just the PNG the readme points to.)

It's a good idea to add a `timeout-minutes: 15` or something like that for the job (in case the scraper hangs). I agree it is cheeky, but I enjoy watching things I built just work over time.
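
For anyone copying this, `timeout-minutes` goes on the job (or on a single step); a sketch with placeholder names:

```yaml
jobs:
  update-chart:
    runs-on: ubuntu-latest
    timeout-minutes: 15   # kill the job if the scraper hangs
    steps:
      - uses: actions/checkout@v3
      # ... scrape, regenerate the PNG, commit ...
```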


I've loved doing this myself - https://gitlab.com/jamietanna https://github.com/jamietanna - which pulls information from my website and exposes it through the machine-parseable Microformats2 standard.

Looking back, I did it over a year ago! (https://www.jvt.me/posts/2022/01/12/autogenerated-profile-re...) I keep meaning to rewrite it because it's some _very_ rough Go, among the first Go I ever wrote.


So is this why you've got like an average of 40 commits/day 7 days a week?


No, in that case it's because things like publishing to my website count as commits, compounded by the fact that most days I work on at least one side project.


It would be easier if your blog offered an RSS feed... but thank you for the idea.


Yeah, I was kind of surprised to see the HTML scraping going on; if you provide an RSS/Atom feed, this task gets a lot simpler and doesn't need the overhead of Ruby (or any programming language); it can easily be done with a bash script.
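
For what it's worth, the bash version really can be tiny; a rough sketch (the feed URL is a placeholder, and anything beyond a simple feed deserves a real XML parser):

```shell
# Grab the most recent <title> elements from an RSS feed.
curl -s https://example.com/feed.xml \
  | grep -o '<title>[^<]*</title>' \
  | sed 's/<[^>]*>//g' \
  | head -n 6 \
  | tail -n 5   # skip the feed's own channel <title>
```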


Woah, I’ve been actively using GitHub for over a decade and I had no idea it had profile pages like this if you make a repo that is the same as your username. Is this a new thing? How did I completely miss the bus on this?


Relatively new; the feature rolled out in 2020. It uses yourname/yourname as a special repo.


This kind of activity is exactly what I hate about Github and the new and unimproved "open source community".

No one cares you updated your vimrc file. Nobody wants to see another "awesome list" in a code repository. Nobody cares about that cool project you started over a weekend and haven't touched in three years.

Your commits are not important.

We have become so narcissistic. We are now coding with the Kardashians.


You can also make UIs by chopping up an image and making some of them links. Check out my profile, the buttons are clickable: https://github.com/veggiedefender
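
The underlying trick is plain markdown: slice the image into pieces and wrap each slice in a link (paths and URLs here are made up):

```
[<img src="assets/button-blog.png" width="120">](https://example.com/blog)
[<img src="assets/button-projects.png" width="120">](https://example.com/projects)
```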


Anyone wants memes on their profile? https://github.com/Bhupesh-V


ChatGPT, sing my praises like a medieval bard would sing of a dragon-slaying knight, using this list of commits and contributions.


This became popular 3 years ago when I made my profile auto-update with my latest blog posts [1]. Fun stuff. Others had way more creative ideas.

[1] https://darekkay.com/blog/github-profile-readme/


excited to see the next leap forward in github becoming myspace for devs



