
About Gwern

Who am I online & what have I done? Contact information; sites I use; computers and software tools; things I’ve worked on; psychological profiles

This page is about me; for information about Gwern.net, see About This Website.

Personal

A transition from an author’s book to his conversation, is too often like an entrance into a large city, after a distant prospect. Remotely, we see nothing but spires of temples and turrets of palaces, and imagine it the residence of splendour, grandeur and magnificence; but when we have passed the gates, we find it perplexed with narrow passages, disgraced with despicable cottages, embarrassed with obstructions, and clouded with smoke.

Samuel Johnson, The Rambler, No. 14 (1750-05-05)1

Behind a remarkable scholar one often finds a mediocre man, and behind a mediocre artist, often, a remarkable man.

Friedrich Nietzsche, Beyond Good & Evil §137

The reader lives faster than life, the writer lives slower.

James Richardson, “Even More Aphorisms and Ten-Second Essays from Vectors 3.0”

Work

I am a freelance American writer & researcher. (To make ends meet, I have a Patreon, benefit from Bitcoin appreciation thanks to some old coins, and live frugally.) I have worked for, published in, or consulted for: Wired (2015), MIRI/SIAI2 (2012–2013), CFAR (2012), GiveWell (2017), the FBI (2016), Cool Tools (2013), Quantimodo (2013), New World Encyclopedia (2006), Bitcoin Weekly (2011), Mobify (2013–2014), Bellroy (2013–2014), Dominic Frisby (2014), and private clients (2009–); everything on Gwern.net should be considered my own viewpoint or writing unless otherwise specified by a representative or publication. I am currently not accepting new commissions.

Websites

“I don’t speak”, Bijaz said. “I operate a machine called language. It creaks and groans, but is mine own.”

Frank Herbert, Dune Messiah

I have no connection to the French singer, to gwern.com, to any locations in Wales, to the gwern on MySpace, or to either account on Pivory.com (which are connected to an attempted extortion of me).

Wikis

I have been active on the English Wikipedia and related projects since January 2004. Cumulatively6, I have over 90,000 edits and have written or worked on hundreds of articles; during my time as an English Wikipedia administrator, I performed thousands of administrative actions. I am also an admin on the Haskell wiki, handling routine spam & vandalism.

I also ran a custom Google search tool, “Wikipedia Reliable Sources for anime & manga”, with >4,542 websites on its blacklist & whitelist. (The source/lists are publicly available.) It returned much more useful7 results for topics in popular culture and, as the name suggests, anime & manga in particular.

Uses This

I’m sometimes asked about my tech “stack”, in the vein of “Uses This” or The Paris Review’s “Writers at Work”. I use FLOSS software with a text/CLI emphasis, on a custom workstation designed for deep learning & reinforcement learning work, in an ergonomic home office with a portrait-orientation monitor, Aeron chair, & trackball.

Software

I run Ubuntu Linux with a tiling window manager & CLI-centric habits. (I prefer Debian, but NVIDIA driver support has been better on Ubuntu, so as long as I need GPU acceleration, I will be using Ubuntu.) I began using tiling window managers with ratpoison and helped drive the initial development of StumpWM and then xmonad (my config), which I still use in conjunction with MATE, a fork of GNOME 2, the last good version of the GNOME desktop environment before the crazy GNOME 3 ruined everything.
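
For the curious, a minimal xmonad.hs along these lines might look like the following (a sketch rather than my actual config, assuming xmonad-contrib’s XMonad.Config.Mate module, which keeps MATE’s panel & session management working under the tiling window manager):

    import XMonad
    import XMonad.Config.Mate (mateConfig)

    -- start from the MATE-aware defaults, then override a few fields
    main :: IO ()
    main = xmonad mateConfig
      { terminal = "urxvt"      -- the terminal emulator mentioned below
      , modMask  = mod4Mask     -- use the Super key as the modifier
      }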

I spend most of my time in Emacs editing Markdown (my config), in Firefox (extensions: Evernote plugin, HTTPS Everywhere, NoScript, uBlock Origin, LastPass, RECAP), or in urxvt/Bash/screen. Most of my R/Haskell/Python programming is done in a REPL plus Emacs. (Friends don’t let friends use heroin or org-mode—are you ever really going to make back the time it takes to learn & customize org-mode?)

Miscellaneously: I use Mnemosyne for spaced repetition, Liferea for RSS, Evernote/NixNote for clippings/notes, rTorrent for downloads, mpv/Clementine for media playing, irssi for IRC, arbtt for time-tracking, ledger for finances & Google Calendar for scheduling/reminders, Redshift for tinting the screen at night to help with bedtimes, and duplicity for backups.

Hardware

Computer

As of June 2020, I use a workstation PC (which I built myself), a large Dell monitor mounted in portrait mode for reading8, a 200-foot Ethernet cable (which required I dig a trench to the next house), a Logitech thumb trackball, a G.SKILL KM360 mechanical keyboard9, and Bose noise-canceling earphones.

The workstation is plugged into a 900W UPS for protection against the not-infrequent lightning storms here, and a 6TB external drive for daily incremental backups, supplemented by Backblaze B2 (~$4/month) & miscellaneous external drives. While traveling, I use my ThinkPad P70 laptop, which replaced an Acer Aspire V17 (which died in a most unfortunate way), which replaced a Dell Studio 17, which replaced a PC I built ~2008.

I designed the workstation to be useful for deep learning, reinforcement learning, and Bayesian statistics; unfortunately, those are fairly contradictory requirements (DRL wants RAM+CPU while DL wants mostly GPU), so I settled on a Threadripper+dual-GPU design (while not forgetting that IO is often a bottleneck), and the result wound up being much more expensive than I would have liked. (I went overboard on RAM in part because I was frustrated by how I kept hitting RAM limits while testing out various dynamic programming algorithms for the Kelly coin-flip game, and because that much RAM means that entire datasets can be cached or worked with in-memory in R/Python, avoiding the considerable complexity of out-of-core algorithms or optimizations.)
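
To give a flavor of why such dynamic programming eats RAM, here is a toy Haskell sketch of backward induction for the coin-flip game, an illustration rather than the code I actually ran: wealth is discretized in cents up to the game’s $250 cap, each flip wins with probability 0.6, and scaled-up versions of this (finer grids, exact arithmetic, storing the full bet policy for every wealth level and round) are what run into memory limits.

    import Data.Array

    cap :: Int
    cap = 25000                  -- the $250.00 payout ceiling, in cents

    p :: Double
    p = 0.6                      -- probability the biased coin comes up heads

    -- expected final wealth for each wealth level with 0 flips remaining
    v0 :: Array Int Double
    v0 = listArray (0, cap) (map fromIntegral [0 .. cap])

    -- one step of backward induction: at every wealth level, pick the best bet
    step :: Array Int Double -> Array Int Double
    step v = listArray (0, cap)
      [ maximum [ p * (v ! min cap (w + b)) + (1 - p) * (v ! (w - b))
                | b <- [0 .. w] ]
      | w <- [0 .. cap] ]

    -- value table with n flips remaining
    valueAfter :: Int -> Array Int Double
    valueAfter n = iterate step v0 !! n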

The workstation is a liquid-cooled AMD Threadripper CPU build on a Gigabyte X399 Designare EX motherboard, with 2×1080 Ti NVIDIA GPUs, 110GB RAM (nominally 128GB, but the final stick is unusable due to apparent BIOS issues), a 1TB NVMe drive for OS/home, and an 8TB internal HDD for bulk storage, all in a (unnecessary but too fun not to have) tempered-glass case.

The process of putting it together was difficult—motherboards/CPUs/GPUs have gotten more complex since I last built a PC back in 2008. The first motherboard stubbornly refused to boot, and after I RMA’d it to Newegg (at a cost of $36), the second one initially worked but then died overnight.10 After tinkering & procrastinating for months, I gave up on the Asus motherboard, checked what Puget Systems was using for their Threadripper builds (ThinkMate was still not offering any), and copied their choice of the Gigabyte X399 Designare EX, reasoning that if they were shipping hundreds of such systems, it must be relatively reliable; that motherboard, plus much more forcefully inserting the Threadripper CPU, finally worked, and I was able to switch everything over in June 2018.

While the final result was as powerful and useful as I hoped (especially for working with Danbooru2018, where the 16 cores + 2 GPUs let me create many different specialized datasets & experiment with many different GAN architectures), the experience has soured me on building my own PCs in the future: I clearly no longer know enough about PC hardware to do a good job, and the more expensive the components, the less I enjoy the risk or fact of bricking them. In the future I will probably either rely more on cloud solutions or bite the bullet & buy prebuilt systems. The workstation parts list is available as a PCPartPicker.com sketch.

Other

For scanning books, I use a 12-inch guillotine paper cutter to debind books evenly (a big upgrade from using X-Acto knives with Fiskars curved blades), and an Epson sheet-fed scanner, with imagescan for scanning & gscan2pdf for post-processing.

My desk is an old desk made out of plywood & plumbing hardware by my great-grandfather for my aunt; I repurposed it when I realized it was the perfect size and height. In July 2020, because I failed to find any good standing desks I could buy used locally to test one out, I gave up and bought a 48″×30″ curved bamboo Jarvis standing desk ($609). I experimented with a treadmill desk but found it distracting, chronically unpleasant, and distressing to my cat. I put the desk in front of my bay window so I could enjoy the view and rest my eyes while watching what happens on the river. The bay window unfortunately often gets direct sunlight, so I added reflective sheeting, which greatly reduces the heat during the summer (at the cost of making it gloomier in winter, of course, but that is why I have bright LED bulbs).

The chair is a used Aeron chair I bought off Craigslist for $225 in November 2016 (a bargain, although I doubt I would pay the list price); I replaced the mesh when it tore through in April 2021 for $60, which is much cheaper than the usual recommendation of replacing the entire seat-pan ($200). The sisal cat tree (Petco) provides an excellent perch for my cat, and I have added a pet flap with a cat window sill so he can more easily come & go, with acrylic sheeting to reduce air flow. (He turns out to greatly dislike soft surfaces, so half of the cat window sill was useless! I had to replace the foam padding & cover with a sheet of plywood I cut to fit.) The box fan by my feet (Walmart, $19 in 2017) & the workstation both rest on rubber-cork anti-vibration pads. To reduce RSI, I keep a grip exerciser around to use during idle moments like watching videos.

For making tea, I boil water in a simple adjustable electric tea kettle which I’ve made ‘programmable’ by drilling a hole into the clear plastic & inserting a meat thermometer (a combination far cheaper than electronic kettles, and more trustworthy); I then steep the tea in a Finum filter inside a big Colonial Williamsburg ceramic fox mug.

My cat would like to remind you to take a typing & computer break every hour.


Mailing Lists

MOOCs

Finished:

Incomplete:

Abandoned:

Profile

This section covers some of the most important things one could know about me: my personality and psychological makeup. No doubt some readers expected a carefully airbrushed & potted biography describing where & when I was raised, what my familial & tribal affiliations are, or what famous institutions I am affiliated with, even though this information is almost entirely useless—what can one predict about me from knowing that I was born in Illinois and raised on Long Island, beyond (maybe) my accent and a general liberalism? The irony—that people most want the information they will learn the least from—will not be lost on those familiar with signaling. In contrast, standardized & validated psychometric instruments like the NEO-PI-R or RAPM really do have predictive validity for many life outcomes.

(Much of this data comes from YourMorals.org. I plan to retake the surveys, if possible, every decade; it will be interesting to see what changes.)

Personality

To describe my personality briefly: I am introverted, calm, neither particularly industrious nor lazy, contrary, and pathologically curious. I have made a copy of my 2011–2014 responses to the YourMorals.org corpus, discussed in more detail below. My scores on the “Big 5 Personality Inventory” (1/2/3):

  1. Openness to Experience11: high (short) or 87/87th percentile (long)

  2. Conscientiousness12: medium or 64/69th

  3. Extraversion13: low or 6/7th percentile

  4. Agreeableness14: medium-low or 3/3rd percentile

  5. Neuroticism15: medium-low or 16/13th percentile

For those who enjoy playing the game of ‘ad hominem via lay psychiatric diagnosis’, may I suggest not accusing me of Asperger syndrome—which is so overdone—but something more novel & scary-sounding like schizoid personality disorder?

Philosophy/morals

The relevant results

Politics

Contact

  • Email: gwern@gwern.net; I do not use Skype or Zoom.

  • PGP key (mirror; fingerprint: 0329E13129E08F19EDBA7250678AC516DD6A88CF)

-----BEGIN PGP PUBLIC KEY BLOCK-----

mQGNBGEs5RgBDADVRdbC0D0LfQwpQFvTNJLZX6s3Ew/Tk5LW8VgK32Z1zhQr6YdB
nm0/U/mlm32SsoqEY+u5O/zvfF9GNA7sbatZwTPFlO5+Inq/FNqMq+dht52TSqiz
I6Oya1ryuI7142nga90nQhXlHLkwkIZODdb6QYIcgIOrc0yI3bEDmFw5bVHXAMH5
J6rjADi27ZALZxLoU4L+u4vgtHOVBOWsuTOaXR4z0zTJ/Wi3Emi1EqZXu8Z4dTfq
YbVLqpL//z3B8ZZLc+1dY8lzacPXWulFhGN9yHeMz/vXgXdOSZfHMQYytqTL3Z3k
prv11QLSF3UaE4Fncm7FZkX2YjVP3vGghinZ3Uvv71bJdU3FbwM7KK6wbby2cBo0
9xaZlyYrv8e8MxqTa5LYJw+j84DOE6bQq2KyyJewjc0Q0vz+t43rNr5uwxD1DJuZ
90jb3ylvDPu6qMX/AVlgODXTmV+8GIKiJP0UytpDi549ZatjTJLTMHHySxytFONZ
jlcu0+88MBpMcSEAEQEAAbQfR3dlcm4gQnJhbndlbiA8Z3dlcm5AZ3dlcm4ubmV0
PokB1AQTAQoAPhYhBAMp4TEp4I8Z7bpyUGeKxRbdaojPBQJhLOUYAhsDBQkDwmcA
BQsJCAcCBhUKCQgLAgQWAgMBAh4BAheAAAoJEGeKxRbdaojP4tIMAIMkP60FS3Nm
HZbwk0t48zJDGLzpqK0UWnz+BxLRxJPOgZY7+0P53nfcpSFiwiG+cFB+06eaK5Hg
ttcGJd6NcLnAyHdHDqUOSF3Yf/8vIXIzggC9jt+mc2ug27GGHAT5asY4CNX8N5Ho
VH1cnCvrTMDINRrir8BX1puPnsfRIshG8cVaKW/hXpDHRvpq3v2UbMULHUBTLPBE
PDDH/O2VUG2hj267TIcM/dvaBwfC74R47Phkiis+nwcFr7OsYs3mIkCsZ4RNxYns
xuNkv1aNSFI6agmvBSfsoLg9Ch2Viw6CEfLpeAwzFH/DIljzHWJJ3JqCYYEEEGkI
eb7cAzfdCoR2iISSGR4GH1XNujaFSTXnuvX/vUsxdvmaxnwm2VkbNDLEwCAv8gxA
O/eA6JJGZl1jbQq/oUouYWqGmLjMBVVjR2r3Y9KNJMsLOT6xOXKwDWAsdNS6N1uv
d4B7ZyE16mPfpH0SB8uNpT155FSlvRGo9rWvAWIPZh3X9Ko3/aX8u7kBjQRhLOUY
AQwAycf9rk7GFRm7d7EE38GnUzwrkEWug4uczdRhknXN5rP4/a2aBOaUEX8WE7Lu
vyyiCWyI2qca3koVCzOt8DR3zUGhAAIZruHJmC8zWMSM9Bu4oLxmK1EvZ8zKpWKT
1vsfIwKd3NsjEFTWvC19BI1S5NRWRiK97Kx6IPGcchSRk8SV1dbLpMNFM/u1oUuZ
lS0DGydjvl+mcDHRL4Zk2Xkj3XcmJ8kQIAT0814UtuOSHvbcEkVU7fDs64MgE6Z1
6J9nUZYv6nN870N7QHBWmoKhipGEeNNimHZ4OdZuJdsSS19iHKYu9LVM68YMtBC2
FypgLs7BOmaiqtfE/NpIOeJGkxW8S7UQtzsZwzfLzD81k0fXZectsMhS1Cf4aLMR
vcWkucLHGW4QcvuUDOHCjtNPSX5MTHngI6qZOHQfVDV2bsc4Gge7TRtj8ghkhkDh
g3ElnFExJQkzwQcd2lTy3vXx4Ne1bxtcqNJIYiGZ+HCYaBvD3y2QgtckfxxBt1D5
MRu/ABEBAAGJAbwEGAEKACYWIQQDKeExKeCPGe26clBnisUW3WqIzwUCYSzlGAIb
DAUJA8JnAAAKCRBnisUW3WqIz7axC/9SDOtdsW/H2PezxDAEHasQvMJsS6/CxPL+
MvEEMx7GTQBUk2Ktz538lJhxUirw9UfixcxuIuaBwta/pDbeqqpHFg7gJbU+pvDQ
ROqgv84gsvUcAYfOIFFKfKmyE1NO+dSGmyDvCaUr8Z5xKWWwAg5MEHNDi0zJWV/F
lCRCb4Q6o7lkzHWljzzAAN+UHs2lq0089WMor/D7CXD5yQGUjcXGbLvoxJl5mDtQ
0TwRdGlF7RQDOSEvNro0eooed80im4bXW+ahhuZsylocVPIlgRoOjCkSNKQTnbd/
ysTzPhCdi59UYIvOuogRxGGBr/uDBcq53amZeVIsxHyT6v3njw0oGC8+2F1fPuSp
5A234E2+lNBmO1TZaM9SH5G0V0mDaV+35JS+gVWQ0R7qRix33+yXHcSCdpoIQTRj
wnuJtQ6JQ4ToVdZsGyfnn6N6fkTvYFx3bYl2EV5d30cb5oQpcd4QiFDXzZg2yjCE
p5IV9yhbHXXHKYyLpLwJN8KpBO4oM0s=
=1NJ8
-----END PGP PUBLIC KEY BLOCK-----

Collaboration Style

Once on #haskell, I was asked why I have no large programs to my credit; I replied, “My problem is that most programs I use already exist.”

I am not a bad Haskell programmer (although I am no guru like Simon Peyton Jones, Apfelmus, or Don Stewart), but given how long I’ve been using Haskell, my contributions probably look pretty slim. This isn’t because I don’t like Haskell—I do; I find functional programming natural: defining transformation after transformation until the result is what I need. And of the functional languages, Haskell seems to offer the best combination of power beyond basic arithmetic or list processing, one of the best ecosystems, and a good core language. (Which is not to say it’s perfect: there are some sharp edges in the basic math which irritate me when I’m messing around in the REPL.)
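
To illustrate the kind of sharp edge in question (the usual numeric suspects, rather than any specific complaint of mine), a few lines at the GHCi prompt suffice:

    ghci> length [1,2,3] / 2
    -- type error: length returns an Int, which has no Fractional instance;
    -- one must write fromIntegral (length [1,2,3]) / 2
    ghci> 2 ^ (-1)
    -- *** Exception: Negative exponent
    -- (the integral (^) refuses; you wanted (^^) or (**))
    ghci> map (-1) [1,2,3]
    -- type error: (-1) parses as negative one, not a section; use subtract 1
    ghci> (-7) `div` 2
    -- -4   (while (-7) `quot` 2 is -3: two integer divisions, two roundings)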

This is partly because of my style of contribution. I’ve always preferred to work on existing applications and libraries rather than to go write my own. I’ve always preferred to take someone else’s work and bring it up to snuff rather than write a clean implementation of my own. I’ve always preferred prodding the author or maintainer into doing the right thing rather than dropping a large batch of patches onto them. Likewise, I view it as better to use Haskell standards like Cabal or Darcs than something like Autotools, even if the latter lets us manage just a little more automation. I view it as better to upload to Hackage than to use any fancy site like GitHub or SourceForge.

It’s better to do the yeoman’s work of taking two similar modules in two applications and splitting them out into a library than to write even the fanciest purely functional finger tree using monoids. Better to commit changes that reduce user configs by a line than to demonstrate once again the elegance of monads. Better by far to file a bug than to wank around in #haskell golfing expressions.

It is much better to find some people who have tried in the past to solve a problem and bring them together to solve it, than to solve it yourself—even if it means being a footnote (or less) in the announcement. What’s important is that it got done, and people will be using it. Not the credit. It is a high accomplishment indeed to factor out a bit of functionality into a library and make every possible user actually use it. Would that more Haskellers had this mindset! Indeed, would that more people in general had this mindset; as it is, people have bad habits of repeatedly failing when they think they have special information, are highly overconfident even in objective areas with quick feedback, and badly overestimate how many good ideas they can come up with16—indeed, most good ideas are Not Invented Here. One should be able to draw upon the wisdom of others.

This is an ethos I learned working with the inclusionists of Wikipedia. No code is so bad that it contains no good; the most valuable code is that used by other code; credit is less important than work; a steady stream of small trivial improvements is better than occasional massive edits.

A leader is best when people barely know that he exists, not so good when people obey and acclaim him, worst when they despise him. Fail to honor people, They fail to honor you. But of a good leader, who talks little, when his work is done, his aims fulfilled, they will all say, ‘We did this ourselves.’17

This is not an ethos calculated to impress. Filing bug reports, helping newbies, commenting on articles and code, cabalizing & uploading code—these are things hard to evaluate or take credit for. They are useful, useful indeed (neither shepheb nor, eg., I ever boast in #xmonad of having helped 5 newbies today, but over the months and years this friendliness and ready aid is of greater value than any module in all of XMonadContrib), but they will never impress an interviewer or earn a fellowship. Is that too bad? Did I waste all my time?

I don’t think so. I value my contributions, and the Haskell community is better for it. It may have made my life a little more difficult—all that time spent on Haskell matters is time I did not devote to classes or jobs or what-have-you—but ultimately they did help somebody. One could do worse things with one’s time than that.

Coding Contributions

I mostly contribute to projects in Haskell, my favorite language; I have contributed to non-Haskell projects such as StumpWM, Mnemosyne, GNU Emacs18, etc., but not in major ways, so I do not list them here. After starting this website, I wound down my regular coding activities in favor of my writings; when I code now, it tends to be tools documented or hosted on this website (eg. Archiving URLs, Resorter) or integrated into writeups (eg. Generating Anime Faces with StyleGAN). For that code, you can browse by language tag: C/CSS/Haskell/JS/Python/R/Scheme/shell.

Below is a more detailed list of my old Haskell contributions, most of which are now of only historical interest.

Haskell

  • arbtt

    • wrote tutorial on configuring the time-tracker & defining rules: “Effective Use of arbtt”

    • documented dependencies, similar software, configuration syntax mode, CLI flag corrections

  • Darcs

    1. Switched from FastPackedStrings to ByteStrings

    2. Low-level C optimization

    3. Initiated Cabalization (my work initially appeared as darcs-cabalized and then was merged into HEAD and darcs-cabalized deprecated)

    4. Refactoring of shell tests

    5. Initiated switch from MoinMoin wiki to Gitit

    6. Identified performance issue & instigated addition of --max-count option for Filestore

  • XMonad

    1. regular XMonadContrib patch reviews

    2. Config archive downloader

    3. Contributed modules:

      1. XMonad.Util.Paste

      2. XMonad.Actions.Search

      3. XMonad.Actions.WindowGo

      4. XMonad.Util.XSelection

    4. Maintained the previous modules19

  • Yi

    1. Contributed modules:

      1. Yi.IReader

      2. Yi.Mode.IReader

      3. Yi.Hoogle

    2. Improved Emacs keybindings

    3. Initiated ‘Unicodify’ or ‘Pretty Lambdas’ feature for Haskell syntax highlighting

    4. Added movement-related functions for improved incremental search

    5. Cleanup20

    6. Comment support to cabal-mode

  • Lambdabot

    1. (Re)Cabalized21

    2. Adapted to use Mueval

    3. Refactored out code in multiple packages:

      1. show

      2. lambdabot-utils

      3. brainfuck

      4. unlambda

    4. Implemented run-in-any-directory functionality (previously Lambdabot could only run in the repository directory)

    5. Cleanup

    6. Maintained it (with Cale Gibbard)

  • Gitit

    • Wrote Darcs backend (which was moved to the filestore package and became Data.FileStore.Darcs)

    • Did some optimization work (images, JavaScript & CSS minification, wrote gzip encoding & initiated expire headers, JS relocation, fewer calls to expensive filestore functions)

    • Wrote RSS support

    • Wrote Interwiki plugin

    • Wrote Date plugin

    • Wrote WebArchiver & WebArchiverBot plugins (see later archiver standalone tool/library)

    • Wrote Unicode plugin

    • Wrote HCAR entry

    • Misc. bug reports & suggestions

    • Added PDF export functionality

    • Integrated JQuery-based floating footnotes

  • Filestore

    • Instigated its development/use in Gitit & Orchid

    • Maintained the Darcs backend (debug & optimize)

  • archiver: Wrote and maintain it (see release ANN)

  • Mueval: Wrote it

  • wp-archivebot: Wrote it (see release ANN)

  • Change-monger: Wrote it

  • Base

  • Unix: fixed a possible runtime crash in mkstemp; added mkstemp docs

  • Autoproc

    • Cleanup

    • Improved basic functionality

    • Implemented an XMonad-style reload system to allow actual customization

    • Maintained it

  • Frag

    • Updated for GHC 6.8 & 6.1022

    • Cleanup

    • Replaced the non-Free level data and graphics with Free ones

  • Hint

    • Improved examples, docs

    • Added UTF8 support

    • Made it use the ghc-paths library

    • Enabled QuickCheck support

    • Added GHC-options support

  • Hlint: added GHCi integration

  • Pugs

    • Cleaned up their third-party modules

    • Fixed up various Cabal issues

    • Helped maintain it

  • QuickCheck: Data.Complex instance

  • Tagsoup: replaced old custom HTTP download code with standard library functions

  • Hashell: Updated for 6.8’s GHC API; Cleanup; Cabalized

Cabalization

As part of helping shift the Haskell community to centralized packaging repositories as pioneered by CPAN (a fundamental requirement for any modern language), I made a systematic effort to get all extant Haskell code into Cabal format & uploaded to Hackage—whether the original authors wanted it or not. (For all the ruffled feathers and continued infelicities of Haskell packaging, a decade later, no Haskeller would go back to the pre-Cabal/Hackage Autotools days.) I cabalized and/or uploaded (according to the 2013-05-10 Hackage upload log):


  1. This is a literary way of saying I am not as interesting as my writings, and in some respect, it should not matter who I am or what I have done because argument screens off authority.↩︎

  2. When I say “research assistant”, I mean it in the older sense of someone who does detail work for another person’s original research—so I spent a lot of time reading up on specific areas and making notes about stuff my boss needed, and only occasionally did independent work. Not all my work can be made public, but some of it is. A partial list in rough chronological order:

    ↩︎
  3. The following is a list of my submissions to LW I regard as substantive or particularly good, excluding content which can be found on Gwern.net, in chronological order with interesting ones highlighted:

    ↩︎
  4. Of course, I don’t agree with every MIRI or LW position. The intellectual homogeneity has been much over-estimated by outsiders who have not bothered to look at the annual surveys, I think. Here are some major points for me:

    1. MWI: I think that LWers who were persuaded by Eliezer’s MWI writings are wrong to do so, as they are unfamiliar with even the rudiments of any alternative interpretations and cannot judge the matter; how many LWers have ever seriously looked at all the competing theories, or could even name many alternatives (“Collapse, MWI, uh…”), much less discuss why they dislike pilot waves or whatever? Lacking any real understanding, they ought to simply adopt the expert consensus, where MWI seems to have a plurality or bare majority of adherents (with the weak confidence that implies).

    2. Heuristics and cognitive biases: I am not much convinced that knowledge of heuristics & biases helps in ordinary life. Feedback & learning are powerful tools for eliminating error and calibrating predictions, and can justify committing what may look like the sunk cost fallacy; and feedback is what one gets in ordinary life.

      Per Moravec’s paradox, where our knowledge of heuristics & biases will pay off most is in what Hanson would call “Far” scenarios: evolutionarily novel situations with few precedents and only costly or non-existent feedback. (For example, the question of whether artificial intelligence will be developed by 2040: it will only happen or not once, there are few comparable events, the consequences may be dramatic, and our ordinary lives offer no useful insights.) As it happens, this describes much of futurism & forecasting, but we cannot justify our futurism by claiming its techniques are incredibly valuable in ordinary life!

    3. Cryonics girl: The donations appall me, for reasons I lay out at length there—they are a complete abandonment of core ideas like utilitarianism & optimal philanthropy.

    4. Alicorn’s “Living Luminously” paradigm struck me as dubious, not backed by even token research, and likely idiosyncratic to her; I thought her Luminosity e-novel was merely OK despite the endless discussions on LW (rivaling those for Methods of Rationality itself) and that her followup, Radiance, was just terrible. Nevertheless, her novel career seems to continue.

    ↩︎
  5. There is a moderately funny story about how Gerard came to write it, based on my musical incompetence.↩︎

  6. That is, summing up the (surviving) edits of my various accounts over the years: User:Gwern, User:Marudubshinki, & User:Rhwawn.↩︎

  7. Compare the CSE results with the Google Results for the anime Wings of Honnêamise. Which is more useful for an editor? For more details, see my release announcement.↩︎

  8. A trick I discovered when visiting FHI in 2015—I had used widescreen laptops for so long I had forgotten how nice portrait-orientation was for reading.↩︎

  9. I had a Kinesis Advantage keyboard, but struggled with the keymapping & the large physical size, which made it difficult to find a comfortable desk & chair height that left my arms high enough to use the thick Kinesis. In July 2020, I switched to the split ergonomic mechanical Ergodox EZ keyboard ($325), but that wound up having similar issues.

    At this point, I began getting frustrated with the time & money I was spending dabbling in exotic keyboards—the biggest problems with the generic keyboard were the switches and the ten-key island, which took up a lot of space & made it hard to reach for the trackball. So in September 2022, I looked through a keyboard search site until I found a cheap $50 thin tenkeyless keyboard from G.Skill with good low-travel mechanical switches (to help forestall RSI from heavy keypresses), and called it a day. It is thin enough to fit my posture, didn’t require painful relearning of decades of muscle memory, and has been satisfactory.↩︎

  10. My best guess is that my initial problem was that I seriously underestimated how much pressure it takes to insert a Threadripper CPU into its socket—it required a truly terrifying amount of force, and I only got it right after triple-checking online tutorials, videos, & discussions—and that was why the first motherboard never worked at all, while the second one was killed by static electricity or a short.↩︎

  11. See also “Actively Open-Minded Thinking Scale”, “Clarity Scale”, “Engagement with Beauty”, & “a measure of what types of stories you enjoy”.↩︎

  12. See also “Zimbardo Time Perspective Inventory”. Brent W. Roberts criticizes these two inventories when used to measure Conscientiousness.↩︎

  13. See also “Relational Mobility scale”, “Empathizing and Systemizing scales” & “Rational vs Experiential Inventory”.↩︎

  14. See also “Self-Report Psychopathy Scale”.↩︎

  15. See also “Experience in Purchasing Behavior Scale” & “Kentucky Inventory of Mindfulness Skills”.↩︎

  16. For further reading on overconfidence, see all LW articles so tagged. I once read in a book about a study in which subjects were asked to generate ideas for, IIRC, putting out a fire, and to stop only when they were convinced they had thought up all the good ones; they usually stopped when they had thought up only a third. But I have been unable to refind it and would appreciate knowing details if this description rings any bells for a reader.↩︎

  17. Chapter 17, Tao Teh Ching↩︎

  18. For example, my clean-up and extension of the browse-url module was completely rewritten by RMS; so I can hardly take credit there.↩︎

  19. Henceforth, this implies I have a commit-bit (or equivalent) for that project.↩︎

  20. Henceforth, ‘cleanup’ should be taken as referring to extensive miscellaneous changes which include (in no particular order):

    • fixing GHC’s -Wall or hlint warnings

    • replacing OPTION pragmas with LANGUAGE pragmas

    • tracking down licensing information

    • switching from Haskell98 imports to the standard hierarchical module imports

      1. eg. import Char → import Data.Char; nontrivial in some cases where Haskell98 modules were dispersed over multiple base modules

    • reorganizing the file tree

    • improving the Cabalization

    • whitespace formatting, and so on.

    ↩︎
  21. Henceforth, this typically implies that I uploaded it to Hackage as well.↩︎

  22. Henceforth, this implies that I made whatever changes were necessary to get it compiling on GHC 6.8.x and 6.10.x.↩︎
