We need to archive feeds

As of 2015, Facebook has overtaken Google as the main source of traffic to news sites[1]. The average Facebook user spends 40 minutes a day on the site[2]. If we fail to archive news feeds, we will lose important sociological information.

The Internet Archive does a great job of archiving the web, among other things (VHS tapes included). They’re located here in The Richmond, and you can go visit them on Fridays during lunch[3]. My concern is that while they’re capturing the data itself, they’re missing a really crucial piece of metadata: feed presentation.

The Wayback Machine is necessary but not sufficient for a full representation of the state of the web as it is experienced by real people. If you really want to understand the zeitgeist of today, January 17, 2017, you wouldn’t want to only see the front page of the NYT. You’d want to experience the web mediated through news feeds of real people. I imagine a good number of researchers would like to sample feeds in the years leading up to the 2016 presidential election.

I see there is a Chrome extension that can be used to archive a feed, but I haven’t seen anyone scraping feeds at scale over time. Imagine a sort of opt-in system where volunteers give archivists read access to their social media accounts. With a few hundred opt-ins you could get a decent sample of all the different cultural/subcultural bubbles.

The obvious candidate to solve this problem is Facebook itself. They might already have a feed time machine internally! Contributing a few hundred feeds (and appropriate developer time) to the Internet Archive would be a great move and a PR win.

 

[1] http://www.adweek.com/socialtimes/facebook-is-now-the-top-referral-source-for-digital-publishers/625300

[2] http://time.com/3950525/facebook-news-feed-algorithm/

[3] If you live in San Francisco you should go do this. You will have fun and meet some interesting people. Also, I wish private tech companies would have explicitly open lunches. It could work really well in SF. “Hmm, should I have lunch at Indeed or Slack today?”

Blacker Mirrors

The question of whether or not we’re living in a simulation has cropped up in public fora recently. The thrust of the argument is that future societies are going to love computers and love simulating stuff.

a technologically mature “posthuman” civilization would have enormous computing power; if even a tiny percentage of them were to run “ancestor simulations”… the total number of simulated ancestors, or “Sims”, in the universe … would greatly exceed the total number of actual ancestors. [1]

So if we’re in a simulation it’s most likely being run by a far-future posthuman civilization. That’s totally fine by me. I’m not going to lose any simulated sleep over it.

But consider a similar but very low-probability scenario: near-future simulation.

Suppose that Moore’s law makes computing power more abundant over the next hundred years (or some pre-posthuman time frame), but simulating a universe is still extremely resource-intensive. Simulations run on custom-built hardware and are only within reach for major governments and a few hedge funds. The number of simulated realities is low enough (<100) that there is active monitoring. The administrators aren’t interested in individuals, but in the behavior of societies at large. They want to win wars, craft policy, and beat the market, like an applied version of Foundation’s psychohistory. This scenario doesn’t bother me too much more than the far-future hypothesis. It’s depressing, but not invasive.

The much more concerning scenario is that our simulation is owned not by BlackRock, but by a really well-capitalized Viacom. Our simulation is scoured by editors for juicy tidbits that get cut into compelling reality TV segments, a la The Truman Show. In this scenario, the most embarrassing internal monologues of every person on Earth are played over laugh tracks for millions of fans in the base reality.

I am less concerned about free will and determinism than I am about being laughed at by someone higher up in the simulation chain.

[1] https://en.wikipedia.org/wiki/Simulation_hypothesis#Bostrom.27s_trilemma:_.22the_simulation_argument.22
note: Why is there a photo of Nick Bostrom on that page? What value does that add?

Shot on a Nexus 5X

Google has advertised the Nexus 5X around San Francisco, claiming a high-quality camera. Here’s what the photos look like in practice.

Conclusions

  • Good for close-ups of people and dogs.
  • Good for landmarks in bright light.
  • Okay for landmarks in low light.
  • Bad for far away objects.
  • Bad in low light for candid shots of people. They move too much.
  • Okay for landscapes; best on clear days.

Setting up the Launchpad MSP430 development environment on OSX

A series of stale wikis and forum posts complicates the setup for cross-compiling code on OSX Yosemite 10.10.4 and running it on an MSP430G2231 microcontroller. It’s not hard if you know what to do.

Today, the fastest way to get code running on an MSP430 is Energia, an Arduino-like environment with a specialty language and a purpose-built IDE. It’s actively maintained and seems to just work, which is more than can be said for anything else in this ecosystem.

If you want to write vanilla C with no magic, you need to set up the toolchain and debugger yourself.

1. Install mspdebug with Homebrew.

brew install mspdebug

If you launch mspdebug now, you won’t find any devices because you’re missing the USB driver.

2. Install the MSP430LPCDC-1.0.3b-Signed driver from the Energia site.

A restart is required. Afterward, try sudo mspdebug rf2500 and you should be able to get a shell and see output like this:

Trying to open interface 1 on 006
Initializing FET...
FET protocol version is 30066536
Set Vcc: 3000 mV
Configured for Spy-Bi-Wire
Device ID: 0xf201
 Code start address: 0xf800
 Code size : 2048 byte = 2 kb
 RAM start address: 0x200
 RAM end address: 0x27f
 RAM size : 128 byte = 0 kb
Device: F20x2_G2x2x_G2x3x
Number of breakpoints: 2
fet: FET returned NAK
warning: device does not support power profiling
Chip ID data: f2 01 02

3. Install the pre-built open-source GCC compiler from TI.

Download the msp430-gcc-opensource compiler for OSX.

You’ll have to create an account and swear you’re not Libyan, among other things. Install the compiler using the wizard and remember the destination path.

4. Create a project folder with some source code. There are some examples in <GCC_INSTALL_DIR>/examples/osx.

I tweaked one of the examples to work specifically with the MSP430G2231. Put both files in the same folder:
blink.c
Makefile
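
In case the bundled examples change, here’s a minimal sketch of what blink.c can look like for this chip. It assumes the stock LaunchPad wiring (red LED on P1.0, green LED on P1.6) and uses an arbitrary busy-wait delay constant:

#include <msp430.h>

int main(void)
{
    WDTCTL = WDTPW | WDTHOLD;   /* stop the watchdog timer */

    P1DIR |= BIT0 | BIT6;       /* P1.0 (red) and P1.6 (green) as outputs */
    P1OUT = BIT0;               /* start with red on, green off */

    for (;;) {
        volatile unsigned int i;
        for (i = 0; i < 50000; i++)
            ;                   /* crude busy-wait delay */
        P1OUT ^= BIT0 | BIT6;   /* swap which LED is lit */
    }
}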

5. Build the project with make. This should run commands like these:

~/ti/gcc/bin/msp430-elf-gcc -I ~/ti/gcc/include -mmcu=msp430g2231 -O2 -g   -c -o blink.o blink.c
~/ti/gcc/bin/msp430-elf-gcc -I ~/ti/gcc/include -mmcu=msp430g2231 -O2 -g -L ~/ti/gcc/include blink.o -o msp430g2231.out
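
For reference, a minimal Makefile that produces those two commands might look something like this (the ~/ti/gcc path is whatever install destination you picked in step 3):

CC      = ~/ti/gcc/bin/msp430-elf-gcc
CFLAGS  = -I ~/ti/gcc/include -mmcu=msp430g2231 -O2 -g
LDFLAGS = -L ~/ti/gcc/include

msp430g2231.out: blink.o
	$(CC) $(CFLAGS) $(LDFLAGS) blink.o -o msp430g2231.out

blink.o: blink.c
	$(CC) $(CFLAGS) -c -o blink.o blink.c

clean:
	rm -f *.o *.out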

6. Copy the compiled program to the board. Launch mspdebug and run prog from its shell:

sudo mspdebug rf2500
prog msp430g2231.out

7. Run the program with run. The red and green LEDs should blink alternately.

Bonus fact: since the MSP430 has memory-mapped I/O, you can turn on the LEDs from the debugger with the mw command. Consult the appropriate header file in <GCC_INSTALL_DIR>/include to find the address of the P1 output register (P1OUT). For me, it’s 0x0021.
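
For example, once the blink program has run (so the pin directions are already configured) and you’ve stopped it with Ctrl+C, writing 0x41 to that address from the mspdebug shell should light both LEDs, since bits 0 and 6 map to the two LED pins on the stock LaunchPad:

mw 0x0021 0x41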

How to fix a Sennheiser PC 151 Headset

I bought this headset in 2011. A few weeks ago the mic started to cut out on conference calls. Adjusting the volume caused crackling, and toggling the mic switch produced an unpleasant pop for the listening party. These symptoms pointed to failing connections and poor isolation in the in-line control unit. Let’s chuck it.

Here’s most of what you’ll need.

soldering supplies

  1. Cut out the control unit and throw it away.

Sennheiser PC 151 volume control unit

  2. Strip the wires carefully. It’s tricky to do this without damaging the internal copper wires.
  3. Burn the enamel off the 5 internal wires with a lighter. Even the copper-colored wire has enamel on it.
  4. Solder the like-colored wires together. There are 5 in all: copper, white, red, green, and red-green. It helps to have flux and a little helping-hands tool.
  5. You can check the connections with a multimeter on the continuity-testing setting.
  6. Tape it up. I didn’t have any electrical tape, but I found some surgical tape in a first-aid kit. It works fine, but is incredibly ugly.
  7. Test it out by calling the Skype test number, echo123.

Special thanks to my roommate, Aaron Cake, who let me use all his gear.