My colleague Melissa Santos and I wrote a piece for Model View Culture about perks and how they can be divisive, in spite of the best intentions of the people offering them. The article cautions companies against building their culture, or even their recruiting pitch, around perks; it's a dangerous shortcut to take.
Illustrated piece by Susie Cagle on the effect that the “sharing economy” (think Uber, Lyft, AirBnb, etc) is having on the overall economy. Here’s the bottom line:
The sharing economy has painfully noble goals. But a society and an economics that truly values civic engagement, the commons, and trust between people is one that invests in the protection of those people so they can really prosper, even when something goes wrong.
I would agree that in the presence of a strong social safety net, the sharing economy would look very different. As it is, it looks like a way for rich people to offer workers the chance to grab some crumbs from other workers, all in service of shrinking the overall pie.
Alexis Madrigal has written an interesting look at how Google's self-driving car really works. Google has figured out that rather than building a really smart car, they could instead build a really rich digital representation of the roads on which the car will drive. Obviously there are big questions about whether this approach can be scaled to work for a larger geographic area than Mountain View, California, but I love this approach, which only a software engineer would come up with. I agree with the article that collecting and storing large amounts of this kind of data is a problem we understand better than the problem of building really intelligent machines. If this approach to controlling self-driving cars takes off, it also puts Google in a great position to make money licensing data to any company that wants to build them, rather than building cars itself.
Brent Simmons posted a really nice recollection of his days at UserLand back in the late nineties. I was a Mac user at the time, and I was building sites using Frontier. It really was something completely new. The idea of having "Edit This Page" buttons on sites was pretty radical at the time, and eventually became an expected feature of any decent content management system.
Search engine analyst Danny Sullivan applies his expertise to analyzing the Google penalties against MetaFilter that might have cost the site a huge chunk of its search traffic. It's a really interesting look at how Google penalizes sites that try to game the system.
The section on how Google penalizes publishers for inbound “unnatural links” is really interesting:
It’s insane because it has allowed the same sites that charged people to get links to now charge for them to be removed. Or for the rise of an entire link removal industry. Or for publishers who have long suffered terrible link requests to now get messages from people asking for links to be removed.
The article has some really good advice for Google on how it manages its relationship with publishers. I also liked this bit:
Earn respect. That’s your best defense if things go south with Google. It’s also your best offense for doing well in Google.
As Sullivan points out, people cheered when Google lit Demand Media on fire and set them adrift.
One of the big stories in my small world over the past week or so has been the layoffs at MetaFilter. Matt Haughey broke the news in a MetaTalk post on the state of the site. Here’s the bottom line:
While MetaFilter approaches 15 years of being alive and kicking, the overall website saw steady growth for the first 13 of those years. A year and a half ago, we woke up one day to see a 40% decrease in revenue and traffic to Ask MetaFilter, likely the result of ongoing Google index updates. We scoured the web and took advice of reducing ads in the hopes traffic would improve but it never really did, staying steady for several months and then periodically decreasing by smaller amounts over time.
The long-story-short is that the site’s revenue peaked in 2012, back when we hired additional moderators and brought our total staff up to eight people. Revenue has dropped considerably over the past 18 months, down to levels we last saw in 2007, back when there were only three staffers.
Today, he posted more details on Medium, both about the drop in revenue and Google's recent classification of MetaFilter as a content farm. Other reputable blogs have been receiving link-removal requests as well. I haven't gotten any of these requests myself, or if I have, they've gone unread.
I don't really think of Google as a monopolist, but it is true that Google holds the fate of any number of Internet businesses in its hands. This is true whether they rely on Google-served ads for revenue or on organic search traffic from Google to grow their visitor base. I often tell people that Google is to Internet businesses what weather is to farmers. You can have fertile soil, plant the right crops, and run your farm incredibly well, but if it doesn't rain, you're not going to have anything to harvest in the end. By the same token, if Google makes a change that directs traffic away from your site, you'll find yourself in the same situation as MetaFilter.
That’s scary. I don’t really have any solutions to propose, but the degree to which the Web publishing industry has become almost wholly dependent on Google demands more attention.
Walking on a high wire is inherently dangerous, but you’ve said that you prepare so much that death is not a risk.
No, it’s not. I think this is very wrong to put your life in danger or to gamble with your life. I cannot do the first step not being absolutely sure that I will successfully perform the last.
Relatively speaking, the risk assumed by Petit isn’t any greater than the risk you or I assume when crossing a busy street. The thrill is in the fact that for regular people, doing what Petit does would be impossibly risky. Preparation is everything.
I would put it this way: the fewer people use RSS, the better content providers can allow RSS to be.
But something else has happened over the past ten years: browsers got better. Their support for standards improved, and now there are evergreen browsers: automatically updating browsers, each version more capable and standards-compliant than the last. With newer standards like HTML Imports, Object.observe, Promises, and HTML Templates, I think it's time to rethink the model of JS frameworks. There's no need to invent yet another way to do something; just use HTML+CSS+JS.
Update: Read this excellent follow-up from Sam Ruby as well.
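To make the "just use the platform" point concrete, here's a minimal sketch of my own (not from the article) showing native Promises, one of the standards mentioned above, sequencing asynchronous work with no framework at all. The `delay` helper is a hypothetical stand-in for any callback-based API you might wrap:

```javascript
// Wrap a callback-based API (here, setTimeout) in a native Promise.
// `delay` is a hypothetical helper for illustration, not from the article.
function delay(ms, value) {
  return new Promise(resolve => setTimeout(() => resolve(value), ms));
}

// Promise chains stay flat, share a single error path, and need no library.
delay(10, 2)
  .then(n => n * 3)           // 2 -> 6
  .then(n => delay(10, n + 1)) // resolves to 7 after another short wait
  .then(result => console.log(result))
  .catch(err => console.error(err));
```

This kind of async plumbing is exactly what frameworks and libraries used to have to provide; in evergreen browsers it's simply part of the platform.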
Tim Bray reviews the book of the year, Thomas Piketty's Capital in the Twenty-First Century, and more importantly, rounds up the key reactions from around the Web. I'm very interested in the book, and I'm sure it will be on my "to read" list for a long time.