New programmers often need small projects to work on as they hone their skills. Exercises in courses are too small, and don't leave much room for self-direction or for extending them to follow the student's interests. "Real" projects, either in the open-source world or at work, tend to be overwhelming and come with real-world constraints that prevent experimentation and pure coding practice.
Kindling projects are meant to fill this gap: simple enough that a new learner can take them on, but with possibilities for extension and creativity. Large enough that there isn't one right answer, but designed to be hacked on by a learner simply to flex their muscles.
I used to use a ray tracer as my go-to kindling project. This is a good source of ideas for fun projects.
In our most recent paper, we show that language models are few-shot learners even if they have far fewer than 175B parameters. Our method (combining PET and ALBERT) performs similarly to GPT-3 on SuperGLUE after training on 32 examples, with just 0.1% of its parameter count: https://arxiv.org/abs/2009.07118
Seems a bit misleading: where's the language generation?
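For a sense of what the PET side of that claim looks like in practice, here is a minimal sketch of pattern/verbalizer classification with a masked language model, assuming the Hugging Face transformers and torch packages. The pattern text, verbalizer words, and checkpoint name are illustrative choices, not the paper's exact setup, and the real method also fine-tunes on the 32 labeled examples.

```python
# Minimal sketch of PET-style classification: score a cloze pattern with a
# masked LM and map label words (a "verbalizer") to class predictions.
# Pattern, verbalizer, and checkpoint are illustrative, not the paper's setup.
import torch
from transformers import AlbertTokenizer, AlbertForMaskedLM

tokenizer = AlbertTokenizer.from_pretrained("albert-xxlarge-v2")
model = AlbertForMaskedLM.from_pretrained("albert-xxlarge-v2")
model.eval()

premise = "The cat sat on the mat."
hypothesis = "An animal is on the mat."

# Cloze pattern: the model fills the mask with "Yes" (entailment) or "No".
text = f'"{hypothesis}"? {tokenizer.mask_token}, "{premise}"'
verbalizer = {"entailment": "Yes", "not_entailment": "No"}

inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Find the masked position and compare logits of the two verbalizer tokens.
mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
scores = {}
for label, word in verbalizer.items():
    token_id = tokenizer.convert_tokens_to_ids(tokenizer.tokenize(word))[0]
    scores[label] = logits[0, mask_pos, token_id].item()

print(max(scores, key=scores.get), scores)
```

The paper's full method goes further: it fine-tunes models like this on the 32 labeled examples, ensembles several patterns, and distills the result. The snippet only shows the scoring step that lets a masked LM act as a classifier, which is also why a critic can ask where the language generation is.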