(U.S.) Senate Bean Soup (recipe)

Bean soup is on the menu in Senate restaurants every day. There are several stories about the origin of that mandate, but none have been corroborated.

FastAPI / Typer

FastAPI is a modern, fast (high-performance) web framework for building APIs with Python 3.6+ based on standard Python type hints.

Typer is a library for building CLI applications that users will love using and developers will love creating. It is based on Python 3.6+ type hints.

Edward Kmett - Cadenza: Building Fast Functional Languages Fast

In this talk Ed will give a live-coding introduction to normalization by evaluation. He will then show how Graal and Truffle, on the JVM, can be (ab)used to JIT functional languages, and discuss why this seems like a promising direction for evaluating dependently typed languages in particular.

Implementing Bitonic Merge Sort in Vulkan Compute

An interactive review of Oklab | Raph Levien’s blog

Kurt Vonnegut interviews Robert Caro

OCaml: What are some libraries you almost always use?

Dbuenzli stuff (a small combined-usage sketch follows the list):

  • Ptime / Erratique: Ptime has platform-independent POSIX time support in pure OCaml. It provides a type to represent a well-defined range of POSIX timestamps with picosecond precision, conversion to and from date-time values and RFC 3339 timestamps, and pretty-printing to a human-readable, locale-independent representation.
  • Mtime / Erratique: Mtime has platform-independent support for monotonic wall-clock time in pure OCaml. This time increases monotonically and is not subject to operating system calendar time adjustments. The library has types to represent nanosecond-precision timestamps and time spans.
  • GitHub - mjambon/cmdliner-cheatsheet: Cheatsheet for adding command-line options to an OCaml program using cmdliner
  • Bos / Erratique: Bos provides support for basic and robust interaction with the operating system in OCaml. It has functions to access the process environment, parse command line arguments, interact with the file system and run command line programs.
  • Fpath / Erratique: Fpath is an OCaml module for handling file system paths with POSIX and Windows conventions. Fpath processes paths without accessing the file system and is independent from any system library.
  • Logs / Erratique: Logs provides a logging infrastructure for OCaml. Logging is performed on sources whose reporting level can be set independently. Message reporting is decoupled from logging and is handled by a reporter.
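
Not from the thread, just a minimal sketch of how a few of these compose, assuming the logs (with logs.fmt), fpath, ptime, and ptime.clock.os opam packages:

    (* Sketch: Logs for reporting, Fpath for pure path handling,
       Ptime for an RFC 3339 timestamp. *)
    let () =
      (* Route log messages through a Format-based reporter on stderr;
         the default level is Warning, so raise it to Info. *)
      Logs.set_reporter (Logs_fmt.reporter ());
      Logs.set_level (Some Logs.Info);

      (* Fpath builds and inspects paths without touching the file system. *)
      let log_file = Fpath.(v "/var/log" / "app" / "run.log") in
      Logs.info (fun m -> m "would write to %a" Fpath.pp log_file);

      (* Ptime_clock reads the POSIX clock; Ptime renders it as RFC 3339. *)
      let now = Ptime_clock.now () in
      Logs.info (fun m -> m "started at %a" (Ptime.pp_rfc3339 ()) now)

Note how Fpath and Ptime stay pure; only Logs_fmt and Ptime_clock touch the outside world.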

Atkinson Hyperlegible Font

Free font, designed to be legible for low-vision readers.

Who Gains and Who Loses from Credit Card Payments? Theory and Calibrations

Merchant fees and reward programs generate an implicit monetary transfer to credit card users from non-card (or “cash”) users because merchants generally do not set differential prices for card users to recoup the costs of fees and rewards. On average, each cash-using household pays $149 to card-using households and each card-using household receives $1,133 from cash users every year. Because credit card spending and rewards are positively correlated with household income, the payment instrument transfer also induces a regressive transfer from low-income to high-income households in general. On average, and after accounting for rewards paid to households by banks, the lowest-income household ($20,000 or less annually) pays $21 and the highest-income household ($150,000 or more annually) receives $750 every year. We build and calibrate a model of consumer payment choice to compute the effects of merchant fees and card rewards on consumer welfare. Reducing merchant fees and card rewards would likely increase consumer welfare.
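
The mechanism in stylized round numbers (mine, not the paper's): suppose card sales carry a merchant fee f = 2%, half of all sales are on cards (s = 0.5), and the merchant charges one uniform price. Costs then rise by roughly f × s = 0.02 × 0.5 = 1% of revenue and get passed into prices. A cash buyer pays the extra 1% and receives nothing back; a card buyer with a 1.5% reward pays the same 1% but nets 1.5% − 1% = +0.5%. Uniform pricing is what converts fees and rewards into a transfer from cash users to card users, with banks keeping the difference.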

Extrapolating GPT-N performance - AI Alignment Forum

On benchmark performance, GPT-3 seems to be in line with performance predicted by smaller sizes, and doesn’t seem to particularly break or accelerate the trend. Close-to-optimal performance on these benchmarks seems like it’s at least ~3 orders of magnitude compute away (costing around $1B at current prices).
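
A sanity check on that figure (my arithmetic, not the post's, assuming cost scales roughly linearly with compute): 10^3 × cost(GPT-3) ≈ $1B implies a GPT-3-scale training run priced at around $1M, which is within an order of magnitude of commonly cited estimates.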

Age of Invention: The Paradox of Progress

While the total amount of food has increased dramatically over the past few hundred years, for example, agriculture’s share of the economy steadily fell (from over 40% of the English economy in 1600, and an even greater share of total employment, to less than 1% today, with a similar trend repeated worldwide). Innovation has, in a sense, been the victim of its own success. By creating ever more products, sprouting new industries, and diversifying them into myriad specialisms, we have shrunk the impact that any single improvement can have.

Reflections on 2020 as an independent researcher | Andy Matuschak

A “huge project” for a Silicon Valley tech person may be a year or two long; a “huge project” for a researcher may last a decade. Persistence with a difficult problem may require tens of hours for a tech person and hundreds (or thousands) of hours for a researcher, no matter how quickly they try to work. It’s not that the tech people are constitutionally lazy or something like that: in industry, it usually is, in fact, a bad idea to spend many hundreds of hours thinking about a single problem. Better to create an 80/20 solution or try a different approach. But foundational insights often do require more patient, focused thought than heuristics from tech culture would naturally encourage.

Living here has changed me deeply. Probably other places would have had a similar (or a better?) effect on similar axes. For example, I like what conversation in Cambridge, MA does to my state of mind. But when I lived in Portland, OR, for example, the environment tended to emphasize a different set of values—community, craft, sustainability, enjoyment. I liked these values, too, but I suspect they would not so naturally reinforce my current work.

Geoengineering – The Tipping Point (John Baez)

Is public opinion about to shift on geoengineering (into the Overton window)?

What are the most important statistical ideas of the past 50 years?

We argue that the most important statistical ideas of the past half century are: counterfactual causal inference, bootstrapping and simulation-based inference, overparameterized models and regularization, multilevel models, generic computation algorithms, adaptive decision analysis, robust inference, and exploratory data analysis. We discuss common features of these ideas, how they relate to modern computing and big data, and how they might be developed and extended in future decades. The goal of this article is to provoke thought and discussion regarding the larger themes of research in statistics and data science.
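
As a toy illustration of one entry on that list, bootstrapping, here is a self-contained OCaml sketch (stdlib only; the data is made up) that resamples with replacement to get a rough 95% confidence interval for a mean:

    let mean a =
      Array.fold_left (+.) 0. a /. float_of_int (Array.length a)

    (* Draw a sample of the same size, with replacement. *)
    let resample a =
      let n = Array.length a in
      Array.init n (fun _ -> a.(Random.int n))

    let () =
      Random.self_init ();
      let data = [| 2.1; 3.4; 1.9; 5.0; 4.2; 2.8; 3.7; 4.9; 2.2; 3.1 |] in
      let reps = 10_000 in
      (* Distribution of the sample mean under resampling. *)
      let means = Array.init reps (fun _ -> mean (resample data)) in
      Array.sort compare means;
      (* The 2.5th and 97.5th percentiles bracket the 95% interval. *)
      Printf.printf "mean = %.3f, 95%% CI = [%.3f, %.3f]\n"
        (mean data) means.(reps * 25 / 1000) means.(reps * 975 / 1000)

The same resampling trick works for any statistic, which is part of why the paper groups it with simulation-based inference generally.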

Understanding middlebrow (Scott Sumner)

Interesting take on the SSC / NYT kerfuffle.

Tweets

Could these numbers possibly be accurate? Even if not, I’m sure they’re directionally true and those in the top decile drink a shocking amount.