China is getting closer to tech independence by the day, and I imagine they are happy to sell their tech to anyone who is willing to buy. Not saying this is good or desirable from a European perspective, but it seems quite likely.
This is from April last year. I haven’t seen this angle of the Greenland saga mentioned a lot recently. Is it a coincidence that Trump started to talk about Greenland again right after he and Musk became friendly again?
I was reading a lot of technical books and kept highlighting things I wanted to remember — but I rarely went back to review them. The notes just sat there, on my Kindle or in the reading app.
So I started building something simple: a tool that lets me turn highlights into flashcards with as little friction as possible.
Just select text on your iPhone, share it with the app, and it creates a flashcard using AI — a Q&A pair and a short summary. You can browse cards in the app, or show them on your Home Screen, Lock Screen, or Apple Watch face.
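For the curious, the share step is a standard iOS Share Extension. Here's a stripped-down sketch of the idea; the Flashcard type and makeFlashcard(from:) function are placeholders for illustration, not my actual code:

```swift
import UIKit
import UniformTypeIdentifiers

// Placeholder card model for this sketch.
struct Flashcard {
    let question: String
    let answer: String
    let summary: String
}

// Stand-in for the LLM call that turns a highlight into a Q&A pair + summary.
func makeFlashcard(from highlight: String) async -> Flashcard {
    Flashcard(question: "placeholder question?",
              answer: "placeholder answer",
              summary: String(highlight.prefix(80)))
}

final class ShareViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // Grab the shared plain text out of the extension context.
        guard let item = extensionContext?.inputItems.first as? NSExtensionItem,
              let provider = item.attachments?.first,
              provider.hasItemConformingToTypeIdentifier(UTType.plainText.identifier) else {
            extensionContext?.completeRequest(returningItems: nil, completionHandler: nil)
            return
        }

        provider.loadItem(forTypeIdentifier: UTType.plainText.identifier, options: nil) { [weak self] data, _ in
            guard let text = data as? String else {
                self?.extensionContext?.completeRequest(returningItems: nil, completionHandler: nil)
                return
            }
            Task {
                let card = await makeFlashcard(from: text)
                // ...persist `card` to the backend / a shared app group here...
                _ = card
                self?.extensionContext?.completeRequest(returningItems: nil, completionHandler: nil)
            }
        }
    }
}
```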
This is my first iOS app, and building it has been a great learning experience. I'm using Supabase for the backend, which has been mostly great.
I really like this. I'm going to grab some of my kiddo's 7th grade biology notes (handwritten) and see what it does. I did notice that it's a little hard to manage your deck: removing and shuffling are not obvious?
Let me know how it goes! I'm not always very happy with how the cards turn out; I'll see what I can do with some LLM prompt tuning.
Yeah, it's very barebones at the moment. If you swipe a deck's name to the left in the list of decks, you should be able to delete it, but I haven't added the ability to delete individual cards yet. Shuffling isn't possible; I'm adding it to the backlog.
Given that it's summer break and school is not at the top of their thoughts, they're actually pretty thrilled. Here's what my youngest did: she took her hand-copied notes (from the book), took a photo with her camera, highlighted the text in the photo, and had it create cards. She did this (rapidly) for a whole page of notes, and then had a card deck to drill from.
I'm really glad to hear that! Two things I've thought about that would probably make the process even smoother:
- Create cards directly from camera input in the app.
- Create multiple cards from a source text, so that she can just supply a body of text, the LLM figures out what is important, and the user can then discard any cards they don't find useful (rough sketch below).
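Roughly how I imagine that second point working, sketched against a made-up OpenAI-style chat endpoint; the URL, model name, and response shape below are placeholders, not my actual code:

```swift
import Foundation

struct Flashcard: Codable {
    let question: String
    let answer: String
}

// Ask the model for a JSON array of cards; the user can then discard the weak ones in the UI.
// The endpoint, model name, and response shape here are invented for this sketch.
func generateCards(from sourceText: String, apiKey: String) async throws -> [Flashcard] {
    let prompt = """
    Extract the most important facts from the text below and return ONLY a JSON array of \
    objects with "question" and "answer" fields. Skip anything trivial.

    \(sourceText)
    """

    var request = URLRequest(url: URL(string: "https://api.example.com/v1/chat/completions")!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")

    let body: [String: Any] = [
        "model": "some-model",
        "messages": [["role": "user", "content": prompt]]
    ]
    request.httpBody = try JSONSerialization.data(withJSONObject: body)

    let (data, _) = try await URLSession.shared.data(for: request)

    // Assume the API wraps the model's reply in choices[0].message.content.
    struct ChatResponse: Codable {
        struct Choice: Codable {
            struct Message: Codable { let content: String }
            let message: Message
        }
        let choices: [Choice]
    }
    guard let content = try JSONDecoder().decode(ChatResponse.self, from: data)
        .choices.first?.message.content else {
        return []
    }
    return try JSONDecoder().decode([Flashcard].self, from: Data(content.utf8))
}
```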
You could... ? Or it can just be what it is: addressable from "share". That has the nice property that if she gets a copy of notes from friends or the internet, it's not "built in": it uses the functionality already in iOS?
I’m sure the algorithm can play a huge role in the effectiveness of learning, but for me the difficult part was always creating the cards and actually opening the app to practice.
I've built Komihåg [1] to try to combat this: select any text on your iOS device and a flashcard is automatically created for you, and the app then shows you the cards on your Home Screen / Lock Screen / Apple Watch face.
I haven't gotten around to implementing a sophisticated scheduling algorithm yet, but I will definitely do that eventually.
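For reference, the kind of thing I mean by a scheduling algorithm: a minimal sketch of the classic SM-2 approach that most spaced-repetition apps start from (field names are my own, and this is not what the app does today):

```swift
import Foundation

// Per-card scheduling state, in the style of the classic SM-2 algorithm.
struct ReviewState {
    var repetitions = 0      // consecutive successful reviews
    var easeFactor = 2.5     // grows/shrinks with answer quality
    var intervalDays = 0     // days until the card is due again
}

/// quality: 0 (complete blackout) ... 5 (perfect recall)
func sm2Review(_ state: ReviewState, quality: Int) -> ReviewState {
    var s = state
    let q = Double(min(max(quality, 0), 5))

    if quality >= 3 {
        // Successful recall: grow the interval and nudge the ease factor.
        switch s.repetitions {
        case 0: s.intervalDays = 1
        case 1: s.intervalDays = 6
        default: s.intervalDays = Int((Double(s.intervalDays) * s.easeFactor).rounded())
        }
        s.repetitions += 1
        s.easeFactor = max(1.3, s.easeFactor + 0.1 - (5 - q) * (0.08 + (5 - q) * 0.02))
    } else {
        // Failed recall: start repetitions over without changing the ease factor.
        s.repetitions = 0
        s.intervalDays = 1
    }
    return s
}
```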
Cool idea. I’ll check it out. It would be cool to do something like this for single-word highlights on an e-reader. If I have highlighted a single word, it is because I want to add it to my vocabulary.
Your app looks cool! I've tried a few other apps doing similar things; Clearspace is the one I'm using now. Will give yours a try!
I'm in a similar situation to you (a developer having to do marketing) but have not gotten as far; so far I've only posted on a few subreddits and here on HN. Have you found any nice learning resources?
You select some text on your phone and share it with my app, then the shared text is reformulated into a flashcard (with the help of an LLM).
You can then browse your flashcards in the app, but I’m also working on ways to show the cards to you with less friction, like on the phone’s Lock Screen or on your watch face.
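In case anyone wonders how the Lock Screen part works on iOS: it's done with WidgetKit accessory widgets. A stripped-down sketch of that approach (the card data is hard-coded here; in practice it would come from shared storage, and this isn't my actual widget code):

```swift
import WidgetKit
import SwiftUI

struct CardEntry: TimelineEntry {
    let date: Date
    let question: String
    let answer: String
}

struct CardProvider: TimelineProvider {
    func placeholder(in context: Context) -> CardEntry {
        CardEntry(date: .now, question: "Question goes here?", answer: "Answer goes here")
    }

    func getSnapshot(in context: Context, completion: @escaping (CardEntry) -> Void) {
        completion(placeholder(in: context))
    }

    func getTimeline(in context: Context, completion: @escaping (Timeline<CardEntry>) -> Void) {
        // In a real app the next due card would be read from shared storage here.
        let entry = placeholder(in: context)
        // Ask WidgetKit to refresh roughly every hour so a new card rotates in.
        let timeline = Timeline(entries: [entry], policy: .after(.now.addingTimeInterval(3600)))
        completion(timeline)
    }
}

@main
struct FlashcardWidget: Widget {
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "FlashcardWidget", provider: CardProvider()) { entry in
            VStack(alignment: .leading, spacing: 2) {
                Text(entry.question).font(.headline)
                Text(entry.answer).font(.caption)
            }
        }
        .configurationDisplayName("Flashcard")
        // accessoryRectangular is the family that shows up on the Lock Screen (iOS 16+).
        .supportedFamilies([.systemSmall, .accessoryRectangular])
    }
}
```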
R1 and o1 point towards a world where training models will be a small bucket of the overall compute. That doesn't mean the total amount of AI compute will stop accelerating, just that interconnected mega-clusters are not the only or most efficient way to run the majority of future workloads. That should be negative news for the company that is currently the only one capable of making chips for these clusters, and positive news for the players that can run inference on a single chip, as they will be able to grab a bigger share of the compute pie.