Hacker News | mothcamp's comments

I imagine it's a company similar to Bulletin Intelligence. Would you be open to discussing your experiences in this industry?


Yeah, sure! How'd you like to do so?


Awesome! To protect your privacy on HN, please email [email protected] and let me know whether you'd prefer getting on a call or keeping things in writing. Looking forward to hearing from you!


Slack: https://api.slack.com/docs

In addition to being thorough, it has personality.


I’ll second this one. They don’t try to be cute or too funny; it’s just written in a casual tone. My only (fairly minor) problem is that it’s not always clear where to start. Once I’m in the right general place, I’m pulled along really well, but because there is so much that’s possible, it feels like they need a better overall getting-started page.

Still, grade A work over there on docs!


Literally came here to say this. Great docs.


The vibes I got were dystopian.

I positioned myself as someone seeking help, and the bot came across like an HR person play-acting empathy and care who couldn't wait for me to leave the office. Its advice boiled down to (1) do your research, (2) make a list, (3) take courses, (4) be aware of barriers.


https://www.youtube.com/@nitin_punjabi/videos

Just a place to host my NLP course. Don't know whether I'll do anything further.


Thanks. It's all statically-generated pages with NextJS and Tailwind.
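For anyone curious, a fully static Next.js site like that can be configured with something along these lines (a minimal sketch; the exact option depends on your Next.js version):

```javascript
// next.config.js -- minimal sketch for a fully static export.
// `output: 'export'` tells Next.js to emit plain HTML/CSS/JS at build
// time, so the result can be served from any static host.
module.exports = {
  output: 'export',
};
```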


It's for anyone who wants to learn NLP such that they get (a) an understanding of what's going on under the hood and (b) knowledge of how to get stuff done.

So the ideal outcome is someone who gets an end-to-end view from theory/concept to implementation.

If someone just wants to learn how to use tools/frameworks, I'd stick to the Colab notebooks. If someone's already experienced in ML and wants to learn something NLP-specific, I'd skip around to see what's interesting.


Thanks. Excellent course.


You could start by looking into either multitask transformers or very general seq2seq models like T5. T5, for example, just learns to transform one text sequence into another. So you could fine-tune T5 to produce your target sequence, but rather than outputting an explicit Python list of tuples, it would output a string that looks like a sequence of tuples.

Or maybe skip all that and outsource it to GPT: https://imgur.com/a/BQv6C3K
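The tuple-as-string trick can be sketched like this (the function names and the delimiter format are my own assumptions, not anything T5 requires): serialize your tuples into a flat string to use as the fine-tuning target, then parse the model's generated string back into tuples.

```python
# Hypothetical sketch: preparing seq2seq targets for T5-style fine-tuning.
# The model generates text, not Python objects, so we encode a list of
# tuples as a delimited string and decode the model's output afterwards.

def tuples_to_target(pairs):
    """Serialize [(a, b), ...] into a string the model can learn to emit."""
    return " ; ".join(f"{a} | {b}" for a, b in pairs)

def target_to_tuples(text):
    """Parse a generated string back into a list of (a, b) tuples."""
    pairs = []
    for chunk in text.split(";"):
        chunk = chunk.strip()
        if not chunk:
            continue
        left, _, right = chunk.partition("|")
        pairs.append((left.strip(), right.strip()))
    return pairs

target = tuples_to_target([("Paris", "France"), ("Tokyo", "Japan")])
# target == "Paris | France ; Tokyo | Japan"
assert target_to_tuples(target) == [("Paris", "France"), ("Tokyo", "Japan")]
```

At inference time you'd run the same parser over whatever string the fine-tuned model generates, ideally with some tolerance for malformed output.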


Ah, so if the model is just converting input text into output text, it can really learn how to do just about anything? But, there may be certain aspects of model design that make it better at some types of conversions ("tasks") than others? And there may be certain data sets that you want to train a base model on to get base learning of such as general language comprehension, and then build on top of that for your specific use case?


Yeah, I can see that being the case for specialized domains. With state-of-the-art models widely available to the public, your edge will probably come from knowledge of the domain and its workflows, and from fine-tuning models to suit that domain.


It is kind of like a very opaque but trainable Turing machine.


Your time. That's it.


Really appreciate that. Finding that balance was one of the hardest parts of building this course.


Yes, it's easy to see you put a lot of thought into that. I hope your course receives much more exposure. When I first found your videos a few weeks ago, I was surprised by how few views they had, given the quality of the course.

Do you record the voice track of your videos yourself?

Glad to see you published the final lesson about transformers. Was looking forward to that!


I did record all voice tracks, yeah. If I do this again, I'll probably use a lot of generative tools now. :-D

Hope you find the transformers module useful!


That's impressive. The audio track of your videos is so clean and easy to understand that I was wondering if you used a studio setup or voice synthesis software. Well done!


Will do. Thank you.

