Kindling projects

New programmers often need small projects to work on as they hone their skills. Exercises in courses are too small, and don't leave much room for self-direction or for extending them to follow the student's interests. "Real" projects, either in the open-source world or at work, tend to be overwhelming and come with real-world constraints that prevent experimentation and pure coding practice.

Kindling projects are meant to fill this gap: simple enough that a new learner can take them on, but with possibilities for extension and creativity; large enough that there isn't one right answer, yet designed to be hacked on by a learner simply to flex their muscles.

I used to use a ray tracer as my go-to kindling project. This collection is a good source of ideas for fun projects.
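A ray tracer makes a good kindling project partly because its kernel is tiny: the very first step is just a ray/sphere intersection test, which a learner can then grow into shading, reflections, and scenes. A minimal sketch of that first step (function and variable names are my own, not from any particular tutorial):

```python
# Minimal ray/sphere intersection -- the seed of a toy ray tracer.
import math

def hit_sphere(origin, direction, center, radius):
    """Return the distance along the ray to the nearest hit, or None.

    Solves the quadratic |origin + t*direction - center|^2 = radius^2
    for the smallest positive t.
    """
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None

# A ray from the origin aimed down -z hits a unit sphere centered at z = -5:
print(hit_sphere((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # -> 4.0
```

From here the natural extensions write themselves: loop over pixels, shade by surface normal, add more shapes — exactly the open-ended hacking a kindling project is for.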

Matching GPT-3’s performance with just 0.1% of its parameters

In our most recent paper, we show that language models are few-shot learners even if they have far fewer than 175B parameters. Our method (combining PET and ALBERT) performs similarly to GPT-3 on SuperGLUE after training on 32 examples, with just 0.1% of its parameter count.
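The core idea of PET (pattern-exploiting training) is to reframe a classification task as a cloze question that a masked language model already knows how to answer, with a "verbalizer" mapping label words to classes. A toy illustration of that reformulation — the names are mine, and `score_token` is a runnable stand-in for what would really be an MLM such as ALBERT scoring the `[MASK]` position:

```python
# Toy sketch of PET's pattern/verbalizer idea (illustrative, not the
# authors' code). A real setup queries a masked language model; here
# score_token is a crude keyword stand-in so the sketch runs on its own.

PATTERN = "{text} It was [MASK]."            # cloze pattern wrapping the input
VERBALIZER = {"great": "positive",           # label words the MLM can predict,
              "terrible": "negative"}        # mapped to task labels

def score_token(cloze: str, token: str) -> float:
    """Stand-in for an MLM's probability of `token` at the [MASK] slot."""
    positive_cues = {"loved", "delicious", "wonderful"}
    negative_cues = {"hated", "awful", "broken"}
    words = {w.strip(".,!?").lower() for w in cloze.split()}
    if token == "great":
        return 1.0 if words & positive_cues else 0.1
    return 1.0 if words & negative_cues else 0.1

def classify(text: str) -> str:
    """Fill the pattern, score each label word, return the winning label."""
    cloze = PATTERN.format(text=text)
    best = max(VERBALIZER, key=lambda tok: score_token(cloze, tok))
    return VERBALIZER[best]

print(classify("I loved this pizza!"))     # -> positive
print(classify("The service was awful."))  # -> negative
```

Because the task is posed in the language-modeling format the model was pretrained on, very few labeled examples (32 here) are needed to tune it — which is how a small model can close the gap to GPT-3 on SuperGLUE's classification-style tasks.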

Seems a bit misleading: where’s the language generation?

Some interesting tidbits about the White House Physician

- ‘Reagan’s doctor called the job “vastly overrated, boring and not medically challenging”. He couldn’t attend state dinners due to lack of space but had to wait in his office wearing a tuxedo!’
- “The White House has a much larger medical staff than you’d think necessary: 5 military doctors, 5 PAs, 5 nurses, 3 paramedics (and 3 admins and 1 IT manager).”
- and lots more