DIAMOND SPONSORS
America Online, Inc.
hakia
PLATINUM SPONSORS
Fast Search
IBM
Yahoo!
GOLD SPONSORS
About.com
Adobe Systems, Inc.
Ask
Etelos
Fenwick & West LLP
Level 3 Communications
Microsoft Windows Live
SILVER SPONSORS
Accenture
Cambrian House
Foldera
Fox Interactive Media
Genius
Google
HCL Technologies
Laszlo Systems
Nokia
Radialpoint
Six Apart Ltd.
Sun Microsystems
Yoono
LAUNCH PAD SPONSORS
Mayfield Venture
Norwest Venture Partners
Polaris Venture Partners
BRONZE SPONSORS
Amazon Web Services
Answerbag.com
eSnips
Intel
Intel Capital
Intel Software Partner Program
InternetRealEstate.com
MyDecide
Omniture
ThinkFree
MEDIA SPONSORS
Fast Company
Federated Media
Information Week
Online Publishers Association
Red Herring
Topix.net
Wired Magazine

SPONSOR OPPORTUNITIES

Reach business leaders and technology influencers at the Web 2.0 Summit. Call Susan Young at 415-947-6107 or email

Download the 2006 Web 2.0 Sponsor/Exhibitor Prospectus (PDF).

News & Coverage

Announcements, articles, blogs, photos, and podcasts about the Web 2.0 Summit.


oDesk/O'Reilly Tech Visualizations

on Aug 23, 2007   Benchmark Capital-funded oDesk operates an online project marketplace for hiring and managing remote technical staff. It is free to post a job. They have 5,000 developers in their network. oDesk made two announcements today at the Web 2.0... read more

Web 2.0 Principles and Best Practices

on Aug 23, 2007  Ever since I wrote What is Web 2.0? on the eve of last year's Web 2.0 Conference, people have been asking for more. I've given dozens of talks around the world for companies increasingly far from the tech world. (In... read more

Real Sharing vs. Fake Sharing

on Aug 23, 2007  In a recent brainstorming session about Web 2.0, I made the observation that "harnessing collective intelligence" is the pattern that opened the Web 2.0 era, but that "Data is the Intel Inside" is the pattern that will bring it to... read more

O'Reilly Radar: Deconstructing Databases

on Aug 23, 2007

Dale Dougherty writes:

At EuroOSCON, Greg Stein of Google gave a talk about the open source software development tools offered for developers at Google Code and I came away with an unexpected insight into Web 2.0.
read more

Deconstructing Databases

by dale on Sep 21, 2006

At EuroOSCON, Greg Stein of Google gave a talk about the open source software development tools offered for developers at Google Code and I came away with an unexpected insight into Web 2.0.

In describing the new bug tracking system, he said that while he liked many existing bug systems, he realized there was an opportunity to design a new, much simpler bug tracking system for Google Code. The key, he said, was understanding that they had great full-text search tools available. That made them think differently about how to collect and organize the information in the bug "database." He believed that existing systems spent too much time deciding how to structure data entry and presenting a detailed form for users to fill out. They also then lock down the display of the information. He decided to keep structured data entry to a minimum and rely on text entry. A lot happens with labels/tags/keywords, for instance, to assign priority. The new bug submission form consisted of a text area with a few questions already inside it.

Greg makes a terrific point that could be applied more broadly to business applications, and might even be a design approach for Web 2.0 applications. A whole lot of effort goes into defining and refining the database structure behind most business apps. What is carefully placed in one bucket (or category or grouping) is not found when you look in another bucket. What if powerful full-text search tools change all that? What if, instead of describing in detail the many specific fields of a record that might be important, and then having to train users on what they actually mean and when to use them, you sidestep those tedious tasks and encourage users to write text? And write freely. The more the better. My hunch is that unstructured data can be richer and easier to collect than highly structured data, and therefore more valuable.
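Greg's minimal-structure approach is easy to sketch. The toy tracker below (an illustration of the design idea, not Google Code's actual implementation) stores each bug as free text plus optional labels, and leans on a simple inverted index for full-text search instead of a rigid schema:

```python
import re
from collections import defaultdict

class BugTracker:
    """Bugs are free text plus optional labels -- no rigid schema."""

    def __init__(self):
        self.bugs = []                 # list of (text, set_of_labels)
        self.index = defaultdict(set)  # word -> set of bug ids

    def submit(self, text, labels=()):
        """File a bug: just write text; structure is optional labels."""
        bug_id = len(self.bugs)
        self.bugs.append((text, set(labels)))
        for word in re.findall(r"\w+", text.lower()):
            self.index[word].add(bug_id)
        return bug_id

    def search(self, query):
        """Full-text AND search over the raw bug text."""
        words = re.findall(r"\w+", query.lower())
        if not words:
            return []
        hits = set.intersection(*(self.index[w] for w in words))
        return sorted(hits)

    def with_label(self, label):
        """Labels stand in for priority fields, components, etc."""
        return [i for i, (_, labels) in enumerate(self.bugs)
                if label in labels]

tracker = BugTracker()
tracker.submit("Crash when saving a file with a unicode name",
               labels=["priority-high"])
tracker.submit("Typo on the login page", labels=["priority-low"])
print(tracker.search("unicode crash"))     # -> [0]
print(tracker.with_label("priority-low"))  # -> [1]
```

The point of the sketch is what is absent: no form with a dozen required fields, no fixed taxonomy; search and labels carry the load.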

It would be an interesting exercise to look at overdesigned business applications and consider how they might be designed to look less like a database and more like a conversation.

read more

Web 2.0 Expo Call For Participation

by brady on Sep 20, 2006


Every year there are more sites, jobs and tools in the Web 2.0 space than we can fit in the Web 2.0 Conference. There is demand for detailed looks at the technology and ideas of Web 2.0. To fill this demand O'Reilly and CMP are going to hold our first Web 2.0 Expo, happening April 15-18, 2007 at Moscone West in San Francisco, California. I will be co-chairing the conference along with Dave McClure and Jennifer Pahlka.

We are going to focus on the tools, infrastructure, and concepts used to build these new sites. What is the right web application framework? How do I build energy and buzz about my product? How do I use AJAX to make my site more usable rather than less? How can I scale my site? The Web 2.0 Expo will answer these questions. If you know the answers (or even the right questions), we want your help. Get in touch via our Call For Participation. The deadline for proposals is October 30th, and though a video isn't required, we will certainly factor them into our decision.

Click-thru for more info on the conference

read more

GeoTracing

by brady on Sep 14, 2006

It's great to see so many Open Source GIS web platforms being presented at FOSS4G. In addition to GeoBliki, I saw a demo of GeoTracing, a platform for displaying GPS traces on a map in real time. Just van den Broecke, the creator, is an avid inline speed skater, and he initially built a system (GeoSkating) using Google Maps to help him and other skaters find, rate, and share skating routes in his native Holland. As he started to realize all of the other possibilities, he genericized the platform to become GeoTracing. A hosted version, complete with user tracks, can be found at TraceLand. From Just's description of GeoTracing:

GeoTracing (www.geotracing.com) is an extensible client/server framework for
GPS-based mobile tracklogging/media/feature entry and real-time tracing. LBS
functionality is planned. GeoTracing is geared at providing a foundation framework
for "Locative Media" applications, sports, games and dedicated applications like
animal field observation. Geotagged media has a prominent role in the framework.
Client/server interaction uses XML and Web 2.0 technologies like AJAX. The browser
currently uses Google Maps (with some WMS hacks). GeoTracing is in the process of
going Open Source (at geotracing.codehaus.org).

GeoTracing applications allow users to track and share their movement through the
landscape. While moving a user can enter impressions/annotations in the form of media
and features like Points of Interest. All data is archived/managed in a CMS.
Currently a mobile (smart)phone with Bluetooth GPS is used for entry. Through a
web-browser tracks/annotations can be viewed in real-time on a Google Map. GeoTracing
attempts to provide a foundation framework for specific applications. Several
applications have been realized or are under development. Examples are GeoSkating
(www.geoskating.com), Sense of the City (www.senseofthecity.nl), geodrawing
(www.n8spel.nl). See www.geotracing.com for more
applications. GeoTracing builds on other Open Source frameworks like KeyWorx
(www.keyworx.org developed by Waag Society) and Pushlets (www.pushlets.com developed
by the author). The server uses Java (J2EE), the mobile client (MobiTracer) is
implemented in Java (J2ME). The browser uses AJAX and Google Maps. All client/server
communication uses XML (Pushlets and KeyWorx extension protocols). Compliance with
OGC-protocols and the use of OSS Geo-software like MapBuilder is under study.

As he put it in his presentation yesterday, he is moving the GIS app beyond the traditional when and where into geo-story-telling, adding the who and why. It easily brings the type of storytelling usually seen on sites like 43places and Platial to a realtime mapping site.

Another project that he is working on is Bliin.com. This project, still in closed beta, allows a user to define a geographic area and topics of interest. Then, if a piece of media that meets your criteria is uploaded to bliin, you are alerted and connected to the content owner. You can see a demo of the UI on the site. The source code is also available.

read more

Web 2.0 Expo: Call for Participation Is Open

on Aug 23, 2007

Yes, we are moving right along here in O'Reilly Conference Land. The latest roll-out is the Call for Participation for our brand new Web 2.0 Expo, the companion event to our Web 2.0 Conference. Web 2.0 Expo will feature workshops and sessions over six tracks, a ginormous exhibit hall (we hope, anyway), and networking galore.

You have until October 30 to submit a proposal; registration opens in the fall. (And I do mean "open"--no invitations required for the Web 2.0 Expo.)

read more

Take it to your customers

by nat on Sep 08, 2006

There was a line in this Idealog article that rang true with me: "We were always taught that brands are owned by the customers. But the changes in technology have accelerated that idea to make it a reality." Leaving aside the vacuity of the second phrase, the point I take away is that a lot of Web 2.0 is technology giving users more control over their experience. Case in point: Platial just launched map stylings. They let you customise the look of the map that you embed in your blog or other web page. They're letting their customers own the product. When the customer makes your product theirs by injecting their personality, needs, and life into it, you're doing something right.

read more

Craig Cline

by dale on Sep 06, 2006

Craig Cline passed away last weekend. He is probably best known for his role organizing the program for the Seybold Publishing Conference. He worked for a company whose name and shape changed drastically over time in the binge-and-purge 90's. The Seybold Conference, which usually occurred around this time of year in San Francisco, was one of its acquisitions and one of its victims, coming to an end several years ago. Around that time, I talked to Craig about a new conference that O'Reilly and his company might collaborate on. I came up with the concept of the Web 2.0 Conference sitting in Craig's office, and he was the first person to say that it was a good idea.

Craig had left the company by the time the Web 2.0 Conference came to fruition, and I ran into him several times afterwards. Frankly, he seemed mentally tired, still trying to figure out what it meant for him to move on. He'd talk about his gadgets and blogging about his political interests, and this seemed to stir him up.

Craig was big like a brown bear: slow and watchful, deliberate yet excitable. He had a unique perspective on the printing and publishing industry as it went through the dramatic changes brought about by the Web. He played a valuable role in introducing the agents of change to the old guard. My sympathy goes to his wife, Gayle, and his family. They have set up the CraigClineFriends website for him.

read more

Open Data: Small Pieces Loosely Joined

by tim on Sep 04, 2006

Chris Messina's blog entry about Gdata, entitled Building a Better Mousetrap, shows how people are waking up to the new platform wars. Chris singles out Google in his post:

This continued amalgamation of services behind the Google Account Authentication has consequences beyond the momentary outcry over Google’s supposed steamrolling of companies.

Business is business and competition is a threat to any member of an ecosystem, which is why you’ve got to keep innovating, adapting and bettering to survive. But it’s different when it comes to setting protocols and standards and the seamless moving of data in and out of disparate systems. When those protocols are closed or locked up or can be sealed off at any time, the competitive environment becomes very different.

The problem that I see is Google’s ability to shut out third party services once you’ve imported yourself into the proverbial gLife. No doubt there are feeds and the aforementioned GData APIs but it’s not an open system; Google decides which ports it wants to open and for whom. Think you’ll ever be able to cross-post calendar items from 30boxes to your Google Calendar? Only if Narendra strikes a deal on your behalf — even though it’s your data. Think you’ll ever be able to share your Picasa Albums with your Flickr account? Don’t bet on it. Or — or — how about sharing your Google search history with your Yahoo account? Or merging your buddy list between Orkut and Flickr? Not a chance.

I don't know that Chris is right to point the finger at Google -- I think that every one of the big platform players is doing the same thing, trying to gain competitive advantage by tying their services together. And Google, like Yahoo!, has done a whole lot to use open protocols as well as their own "embraced and extended" versions. And from conversations with Google management, I know that they think hard about the issue of how much to try to control, and how much to give up to keep the virtuous circle of the internet running. (I've had very specific conversations with them with regard to Google Book Search, and am convinced that Eric and other top managers really do understand the value of openness and interoperability, and want to maintain the same "neutral switchboard" position that Google has tried to find with web search.)

So as to Chris' "Not a chance," I reply: there is a good chance, if developers band together, think hard about what standardized protocols we need (and in which areas), and push for interoperability between systems in the area of who, what, when, where -- which are some of the key data subsystems, if you like, of Web 2.0 (Identity, for people and objects, respectively, and calendaring, taking into account both place and time.)

But like Chris, I do think that there's a battle going on for the heart and soul of the future internet.

read more

Google Image Labeler: Bionic Software Strikes Again

by tim on Sep 02, 2006

The launch of Google Image Labeler, a "game" that asks people to label images, and figures that images given the same label by multiple people are likely to be correct, continues the Web 2.0 trend towards bionic software, that is, software that combines machine and human intelligence. This is really just another version of the web 2.0 principle, harnessing collective intelligence, but with an emphasis on "harnessing" rather than on "collective."

Like Distributed Proofreaders (the granddaddy in the space), Amazon's Mechanical Turk, and mycroft, but unlike, say, a Flickr tag cloud as a reflection of collective labeling of images, Google Image Labeler puts people explicitly to work.
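The core mechanic of a labeling game like this, trusting a label only when independent players agree on it, can be sketched in a few lines. The names and agreement threshold below are illustrative assumptions, not Google's actual implementation:

```python
from collections import Counter, defaultdict

# image_id -> list of labels proposed by independent players
proposals = defaultdict(list)

def propose(image_id, label):
    """Record one player's label for an image (case-insensitive)."""
    proposals[image_id].append(label.lower())

def accepted_labels(image_id, min_agreement=2):
    """Keep labels independently suggested by >= min_agreement players.

    Agreement among strangers is the quality signal: a label two
    people arrive at separately is likely to describe the image.
    """
    counts = Counter(proposals[image_id])
    return sorted(l for l, n in counts.items() if n >= min_agreement)

propose("img1", "cat")
propose("img1", "Cat")
propose("img1", "tabby")
print(accepted_labels("img1"))  # -> ['cat']
```

"tabby" is dropped because only one player offered it; the machine contributes bookkeeping while the humans contribute judgment, which is the "bionic" fusion the post describes.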

There's a spectrum of ways to put humans to work refining computer results, from the implicit to the explicit. The most explicit, of course, is going to be when the third world job shops now engaged in making booty for World of Warcraft start offering their services for more general hire.

read more

Introducing Web 2.0 Expo

on Aug 23, 2007

The O'Reilly Conference team and our partners over at CMP are delighted to announce a brand new bundle of conference joy: the first Web 2.0 Expo! Our newest addition to the conference family measures four days in length, weighs in at six parallel sessions and workshop tracks, and sports a full head of expo hall hair.

Enough is enough, I know! Read more about it over on the O'Reilly Radar and here's the official press release.

read more

Web 2.0 Expo and technical conference

by tim on Aug 30, 2006

The Web 2.0 Conference, launched in 2004, gave a name to the resurgence of the internet economy, and framed the ideas that are shaping that resurgence. The second conference, and the paper I wrote for it last year, What is Web 2.0?, laid out the principles that distinguish Web 2.0 applications from their predecessors. The third Web 2.0 conference, November 7-9 at the Palace Hotel in San Francisco, will once again bring together the technical innovators and business leaders behind the revolution, and highlight the drivers of success in the network era.

But there's a long way between the big framing ideas of web 2.0 and their practical application. What's more, the Web 2.0 Conference sold out last year and is on an invitation-only basis this year, with far greater demand than the event can accommodate. And that's why O'Reilly and CMP have today announced a second, companion conference, the Web 2.0 Expo, to be held April 15-18, 2007 at the Moscone West Convention Center in San Francisco. The Expo will include a four-day hands-on technical conference and tutorials as well as a trade show floor.

Every big idea needs implementation. We saw the need for a second event that focuses on how to actually build effective Web 2.0 applications. We're tackling not just Web 2.0 as strategy but also design, programming, operations, and viral marketing -- the elements of execution that will ultimately separate the winners from the me-too companies in the space.

In addition to big picture keynotes, a launchpad for new companies, and a full expo floor, the Expo will include a full technical conference, with tracks for programming and administration, design, and marketing as well as strategy and business models.

read more

Web 2.0 Trademark Redux

by tim on Aug 30, 2006

In conjunction with the announcement of the new Web 2.0 Expo and technical conference, I'm also pleased to report that CMP has agreed to narrow the scope of enforcement of the Web 2.0 trademark registration. It will only seek to protect the Web 2.0 trademark if another Web 2.0-related event has a name that is confusingly similar to the names of the actual events co-produced by CMP and O'Reilly, such as our events "The Web 2.0 Conference" and "The Web 2.0 Expo."

This is consistent with my original understanding about why the trademark filing was made. I must confess that I've always thought that the point was simply to protect the event names, as evidenced by the fact that we have always put the trademark notice at the end of the conference names on the website that O'Reilly produces, "The Web 2.0 Conference."

For those of you who've been under a rock, and don't know why this is news, see the entries about the flap that erupted when CMP sent a C&D letter to IT@Cork, complaining about their IT@Cork Web 2.0 Conference, not knowing that I'd already given IT@Cork my blessing when they invited me to speak. CMP learned from the flap that the right way to engage those kinds of problems in the internet era is with a friendly email or phone call, rather than a lawyer's letter.

(I've been wanting to make this announcement for a while, but had to wait till the Expo (which was still in the planning and naming stages) had been launched.)

read more

The birth of Chumby

by tim on Aug 27, 2006

Last night, we had our first ever "product launch" at FOO camp. (Joshua Schachter says he almost showed del.icio.us at the first FOO in 2003, but wasn't quite ready, and released it a few weeks later.) We normally try to focus on pre-commercial technology at FOO, since by the time there are lots of startups, you're much further from the cutting edge. However, there are so many interesting aspects to the Chumby that we were excited to get behind it, and invite a whole bunch of the chumby team to Foo for the launch:

  1. Web 2.0 meets low-end consumer devices. Chumby is a kind of web-enabled wifi clock radio, with user-generated konfabulator-style info-widgets. There is an initial set of widgets, but the goal is for the community to extend the set. The value of the device is in the service of delivering new net-connected widgets, not in the hardware, or even the software.
  2. Open source hardware. This is a major emergent theme at FOO. (Remember that we don't organize the program around concepts so much as we organize it around people. We find cool people, and they tell us what they're doing, often surprising us by the things they do that we didn't know about.) As there's more Make: style hardware hacking, there's a need for new tools for sharing the details of projects, for thinking through licenses, and the like. (A great example of FOO cross-fertilization: Colin Cross, also at FOO, is working on a linux-powered open source hardware mobile phone, the TuxPhone (project coming soon on SourceForge.) He was excited to meet the Chumby folks to pick their brains about their license.)
  3. Open source software. In a session on forecasting, Paul Saffo remarked on the importance of paying attention to anomalies. The example he gave was a highway road sign that stated "Leaving emergency road side phone service area." This shift from communications being the exception to communications being the norm alerted him to the idea that communications, not processors, would be the driving force of technology in the 90's. So I think it's fascinating that Chumby puts a sticker on the back noting that all the software is open source except flash, which they use for some functions.
  4. Soft hacking, aka Craft:. Chumby CEO Steve Tomlin remarked that Chumby is a device that "you can hack with a seam ripper." Unlike most other consumer electronic devices, it comes with a bean-bag style case (complete with various kinds of sensors so that squeezes and bumps control activity) that can be modded with a sewing machine. Want a hello-kitty version? It's up to you.
  5. They wanted to see how a hacker audience would respond. The device is designed to be hackable on every level, so this is a great alpha group for seeing where a hacker community will take a device like this, as well as whether or not they respond. The prototypes are still under development, and the people at FOO who've gotten their hands on them thus become co-developers of the product before manufacturing goes to scale.
  6. The idea was born at FOO. Key members of the Chumby team met at previous FOO Camps, and the idea was an outgrowth of the ferment we encourage here. So it's a great proof point of what we're trying to accomplish: pack enough smart people into a compact space that they create enough heat that ideas boil over.

Christine Herron gives a great summary of the product launch. The Chumby site has lots more info.

read more

Why Seth Godin's Web 2.0 Watch List Misses the Point

by tim on Aug 24, 2006

Seth Godin's Web 2.0 Watch List, built with alexaholic, is a fabulous idea. I wish I'd thought of it. Rankings are fun and thought-provoking, and as Seth figured, they also get lots of attention!

But Seth completely misses the point of Web 2.0 when he says "For our purposes, my definition is that most of these companies are, as the wikipedia says, sites that 'let people collaborate and share information online in a new way.' So, Google doesn't make the cut, because most of their traffic comes to their search engine. eBay is an "old" company, but the many-to-many nature of the site means that they do."

What a short-sighted definition! It seems to focus much more on the Web 1.0 aspect of explicit "community" than the web 2.0 concept of harnessing collective intelligence. If Google doesn't do this, I don't know who does! PageRank was a breakthrough in search because Google figured out that what people do (make links) is as important as what the documents contain. And Adsense is profoundly participatory. Amazon too is a master at harnessing participation but doesn't make Seth's list. Meanwhile, we see companies like pandora, which are purely algorithmic in their personalization, and zillow, which is database driven without any community features. These sites are incredibly useful applications, but Seth's list sure seems to draw the Web 2.0 boundary in some very odd ways.
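The PageRank insight, that the collective act of linking is itself data about importance, is easy to see in a toy power-iteration sketch. This is a simplified illustration of the published idea, not Google's production algorithm:

```python
def pagerank(links, damping=0.85, iters=50):
    """Toy PageRank. links: dict page -> list of outbound link targets.

    Each iteration, every page passes a damped share of its rank to
    the pages it links to; rank therefore flows toward pages that
    many (and well-ranked) pages point at.
    """
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            if outs:
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:
                # dangling page: spread its rank evenly over everyone
                for q in pages:
                    new[q] += damping * rank[p] / len(pages)
        rank = new
    return rank

# "c" is linked to by both "a" and "b", so it accumulates the most rank
web = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(web)
print(max(ranks, key=ranks.get))  # -> c
```

No page in this little web ever "voted" explicitly; the ranking falls out of what people do (make links), which is exactly Tim's point about harnessing collective intelligence implicitly.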

In addition, even participation (both explicit and implicit) is only part of web 2.0. Can someone really say that Google Maps or Amazon's Simple Storage Service is not Web 2.0? And why are open source projects like Django and Drupal included but not Apache or Perl's CPAN, or sourceforge and slashdot? And if we're including individual blogs, why zefrank.com and not techcrunch?

In short, this is a brilliant idea that needs some serious tweaking.

read more

bullshitr

by tim on Aug 23, 2006

Don't know why this didn't get noticed before: the excellent Web 2.0 bullshit generator. Not as funny as Web 2.0 or Star Wars character?, but still worth a spin. In the heyday of the last bubble, I remember going into a conference room in a shared-office complex (so I don't know whose bs it was), seeing the business plan brainstorm on the whiteboard, and thinking that someone ought to come out with a magnetic poetry kit (in that case, a magnetic bs kit) for generating the style of business plan that was on that board. This is the electronic version of the same idea.

Ultimately, success in any business is about creating value, not about being buzzword compliant. So sites that make fun of the buzzwords provide a very useful reminder.

read more

Smugmug's version of Interestingness

by tim on Aug 23, 2006

Chris McAskill of SmugMug wrote in email:

I noticed your fascination with Interestingness (we're fans too) and thought you'd appreciate a pretty compelling variation. We silently introduced this a couple of months ago to see how it would work without telling anyone.

Our customers tend to be families, soccer moms, travelers...not as geeky as Flickr's customers. They understand words like most popular and understand star ratings and thumbs up/down. They hate to be asked to register to leave a comment, even if it means just giving their email, and even writing a comment is kind of a hassle 80% of the time.

So we instituted a not-very-gameable (you could, but you'd have to be determined) simple thing: when you hover your mouse over an image, you get a thumb-up or thumb-down. No need to register.
We combine the thumb ratings with comment ratings and end up with a browse page of most popular photos for the day that gets over a million people per day dropping by to see and cast their votes.

Almost all our subscribers let democracy speak and flaunt their most popular photos on their home pages: [here's mine]. And communities do the same. Here's a flowers community. You can sort by most popular in any category or keyword, etc.... Pretty nifty? It's a different way to skin the cat for a broad, consumer audience.

Whether it's pagerank at Google, interestingness at Flickr, or diggs, or SmugMug's most popular feature, we see all across the web attempts to incorporate human intelligence into web applications. As I've written many times, harnessing collective intelligence is the very heart of Web 2.0. And that intelligence is distinguished by its bionic nature: we're building applications that are a fusion of human and machine.
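Chris doesn't spell out SmugMug's formula, but a blend like the one he describes, folding thumb votes and comment ratings into one popularity score, might look something like this hypothetical sketch (the weighting and field names are my own assumptions):

```python
from dataclasses import dataclass, field

@dataclass
class Photo:
    thumbs_up: int = 0
    thumbs_down: int = 0
    comment_scores: list = field(default_factory=list)  # e.g. 1-5 stars

def popularity(photo, comment_weight=2.0):
    """Hypothetical blend: net thumbs plus a weighted average rating.

    Thumbs are cheap signals from casual visitors; comments are
    rarer, so each one is given extra weight.
    """
    net_thumbs = photo.thumbs_up - photo.thumbs_down
    avg_comment = (sum(photo.comment_scores) / len(photo.comment_scores)
                   if photo.comment_scores else 0.0)
    return net_thumbs + comment_weight * avg_comment

photos = {
    "sunset": Photo(thumbs_up=40, thumbs_down=5, comment_scores=[5, 4]),
    "cat":    Photo(thumbs_up=12, thumbs_down=1, comment_scores=[3]),
}
most_popular = sorted(photos, key=lambda p: popularity(photos[p]),
                      reverse=True)
print(most_popular)  # -> ['sunset', 'cat']
```

The design choice SmugMug made, a one-hover thumb with no registration, matters more than the exact arithmetic: lowering the cost of voting is what produces enough signal for any formula to rank with.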

read more

Web 2.0 Launchpad

by tim on Aug 23, 2006

At the Web 2.0 Conference, we have a feature called the Launch Pad, which is the opportunity for a new company to make their debut. This year, John Battelle, the program chair for the Web 2.0 Conference, has recruited a stellar team to help us winnow through the companies applying to be featured. It includes VCs, M&A strategists from some of the major companies in the Web 2.0 talent hunt, and a couple of Web 2.0 pundits (besides me and John, that is), so it's a great opportunity to get some exposure, even if you don't get selected for one of the Launch Pad slots. For a complete list of the Launch Pad advisory board, and more details, see John's posting, Help us find the companies that will launch at Web 2.0.

We have more than a hundred submissions already, competing for a dozen or so slots on the conference program, but there are so many interesting new companies popping up that we could easily miss some really interesting ones. So, like John, I'm asking you to let us know if you want to be considered for the Launch Pad at Web 2.0.

read more

Home Solar as User Generated Content

by tim on Aug 20, 2006

In a conversation the other day, Ed Kummer of Disney made a really thought-provoking observation: the spread of solar energy units to homes and businesses is an analog to other forms of user-generated content, and the overall trend towards a two-way network. While it's possible to set up a solar system completely off the grid, most of the new customers feed power into the grid during sunlight hours, and draw from it when the daylight wanes. If we move to a solar power economy, it will be much more distributed and cooperative than the current one-way model.

It's fabulous to put the internet and Web 2.0 into a broader context, and to think about how the new network economics that we're seeing on the internet may be adopted in other fields. With VoIP, we're seeing the internet subsume the telephone network. With distributed solar, and the kinds of distributed energy monitoring technology that Adam wrote about the other day, will the internet model also colonize the power grid?

Hmmm... What was I saying about the internet as the network of networks?

read more

The Three Faces of Steve

by tim on Aug 19, 2006

There's a very thought-provoking article on the O'Reilly Network's MacDevCenter, reading between the lines of Steve Jobs' WWDC keynote:

After welcoming the audience of developers, Jobs let the audience know that others would help him on stage. This, in and of itself, was unusual. There are often supporting roles in the WWDC and MacWorld keynotes but only one featured artist. Not only did Jobs share the stage with Bertrand Serlet, Phil Schiller, and Scott Forstall, but he allowed them to make many of the morning's announcements. In a way, they represented the three faces of Steve.

In his email newsletter, MacDevCenter editor Derrick Story expanded on this point: "As these Apple heavy hitters made the various announcements that Steve often handles, I couldn't help thinking that Apple once again is planning ahead. Steve Jobs won't be CEO forever. Others are going to have to share the heavy lifting." Thought-provoking.

The article also had a couple of other tidbits that struck me:

  1. Continuing with my thoughts about application dialtone, the new Time Machine feature could be described as dialtone for backups: "This new Mac OS X application is designed to help users backup and restore their data. Forstall said that in their estimation only 26 percent of users back up their data. Most of these, however, do so in a manual and ad hoc way. Every once in a while they burn some files to a CD. Only 4 percent have a regular automated backup strategy. Time Machine is automatic backup for the Mac.... The Time Machine UI is stunning. It allows people to look at a directory and zoom back in time until they find the file that is missing or has been changed."

  2. As further evidence of the trend we've been watching about the importance of power rather than pure performance as one of the key competitive factors in the coming market, "The Mac Pro is built around a pair of Dual-Core Intel Xeon 5100 processors available at 2GHz, 2.66GHz, or 3GHz.... Schiller stressed the improvements in performance per watt saying that this new chip is three times more efficient than the G5." While Apple's is a consumer offering, power per watt becomes even more important in data-center heavy Web 2.0 applications.

read more

Round 2: The Internet As Network of Networks

by tim on Aug 19, 2006

The other day, I was explaining to a reporter how I could be lumping in cellphones and the next generation of sensor networks into Web 2.0. "Well, Web 2.0 is really not just about the web. It's really about the next generation of internet applications, and includes things like P2P file sharing and VoIP, which aren't based on the web at all. And actually, now that I mention it, it's really not even about the internet, narrowly defined as a class of TCP/IP-based networks. It's really about the internet as it was originally conceived, as a 'network of networks.'"

Those of you who were around in the 1980s will know just what I'm talking about, because that was originally what people meant by "the internet." The term literally came into wide use to refer to a whole set of distinct but increasingly interoperable networks: the ARPAnet, CSnet, DECnet, the UUCPnet, EUnet, NLnet, FIDOnet.... In the late 80s, we actually published a book called !%@: A Directory of Electronic Mail Addressing and Networks, which covered how to address email across 190 distinct networks. (The title !%@ was an homage to some of the many special addressing characters that were used before the @ crowded out all the others.) The inter-net was the interoperable network that came to connect them.
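
The addressing styles that book documented illustrate how different those networks were: a UUCP "bang path" spelled out an explicit route through relay hosts, while an internet-style address named only the destination. A toy parser makes the contrast concrete (the function and field names are mine, purely illustrative; real 1980s gateways handled many mixed forms):

```python
def parse_address(addr):
    """Distinguish a UUCP 'bang path' (host!host!user), which encodes an
    explicit hop-by-hop route, from an internet-style user@domain address.
    Toy illustration only of the pre-@ and @-style addressing schemes."""
    if "!" in addr and "@" not in addr:
        # Everything before the last '!' is the relay route; the tail is the user.
        *route, user = addr.split("!")
        return {"style": "uucp", "route": route, "user": user}
    if "@" in addr:
        user, _, domain = addr.partition("@")
        return {"style": "internet", "user": user, "domain": domain}
    return {"style": "local", "user": addr}

# A message addressed seismo!uunet!alice is relayed through seismo,
# then uunet, before finally reaching the user alice.
```

Hybrid forms (the % and other characters of the book's title) are exactly what made a directory covering 190 networks necessary.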

It's good to remember this broad definition of "the internet," because the internet is not just about TCP/IP, though it is about the principles that made the TCP/IP based network win out over all the others, and become the lingua franca of interoperability that it is today. We're pushing the boundaries of the old internet, as it comes to include the cellphone network, telematics networks, and other emerging forms of connectivity.

So let's ask, where else can we apply the principles that we're learning from the internet?

Round 2: A series of occasional postings around the theme that patterns and ideas recur, or as Arlo Guthrie said in Alice's Restaurant, "come around again on the gee-tar."

read more

Round 2: Dial Tone

by tim on Aug 18, 2006

In a conversation yesterday, John Fandel, the general manager of The O'Reilly Network, made an interesting point: he wants to build our web publishing tools around the model of delivering "dial tone."

As we talked, the idea took hold. I was reminded of Michael Crichton's observation in his 1983 book Electronic Life that in the 1940s there was concern that the telephone system was growing so fast that there wouldn't be enough operators unless AT&T hired every person in America. AT&T solved the problem by creating automated switching systems that, in effect, did turn every person in the world into an operator--without hiring them. The principle of dial tone is to create a situation where users can do something for themselves that once required the intervention of an operator.

Dial-tone is also a fabulous metaphor for one of the key principles of Web 2.0, which I've called "the architecture of participation," but which might also simply be described as the design of systems that leverage customer self-service. (Bill Janeway made this linkage to customer self-service as a key driver of success in the internet economy in a presentation he gave at the O'Reilly Emerging Technology Conference in 2004. Mitch Ratcliffe blogged his notes from a similar talk that Bill gave at a Red Herring conference.)

You can regard the history of the computer industry as pushing "dial tone" further and further up the stack. As Crichton noted, the rotary dial telephone was the first computer that allowed direct interaction between humans and computers. The personal computer pushed customer self service up the stack to programming, data processing, and eventually applications such as word processing and spreadsheets.

New applications often start out requiring operators, but eventually move towards dial-tone. For example, you can look at blogging as the "dial tone" equivalent of creating a web site. For ordinary folks (not most of my readers, but non-technical folks), creating a web site was something that required an operator. You went to a web design shop or an ISP and had them do it for you. The blogging revolution, the wiki revolution, the MySpace revolution, the CyWorld revolution, are really about providing a kind of self-service dial-tone for creating a web presence and community. P2P applications are dial tone for file transfer. sourceforge and collab.net are software project hosting dial-tone. Craigslist is classified advertising dial-tone.

Similarly, you can look at personal databases like Access and FileMaker, and open source databases like MySQL, as moving in the direction of providing database dial tone.

Once you frame the problem in this way, you understand that one of the challenges for IT departments and companies used to the IT mindset is to get the operators out of the way, and to build new processes that let users do the work for themselves. You also can ask yourself, where is dial tone going next?

Round 2: A series of occasional postings around the theme that patterns and ideas recur, or as Arlo Guthrie said in Alice's Restaurant, "come around again on the gee-tar."

read more

Google Earth and Emergency Aid

by tim on Aug 16, 2006

According to an entry on worldchanging.com, Google Earth just played a role in helping to target air drops of relief supplies in Gujarat, which was hit with serious flooding. The entry cites an Ahmedabad newspaper:

If [officials] could have struck upon this idea before, it would have helped many more people as carpet air-dropping of aid leads to lots of wastage. Using this tool, it was easy to identify buildings and other landmarks.

Sometimes people complain that Web 2.0 is just a consumer internet thing. But stories like this remind us that the increased intelligence available to ordinary people can have worldchanging consequences. In this particular case, it was two ordinary people who persuaded the air force to use Google Earth to better target their aid and rescue efforts.

read more

The Times: You Can't Fool the Children of the Revolution

by on Aug 23, 2007  

Jonathan Webber mentions one of last year's Web 2.0 Conference panels led by Safa Rashtchy in an article on how media is changing:

It's common these days to hear businessmen talk about their children's habits; a focus group of one, or two, or three may not be very scientific but, hey, it's better than total ignorance. At the Web 2.0 conference last year there was actually a panel of teenagers, moderated by a Wall Street analyst, talking about the merits of MySpace versus Facebook and the like.

read more

Web 2.0: Real Time Platforms

by tim on Aug 09, 2006

The post that Nat made this morning about Mark Lucovsky and the Google AJAX search API has some wonderful backstory. Mark is just a fascinating guy, and has a unique perspective on how platforms are changing. He's a big thinker, and has a great way of capturing the fundamental differences between developing for a web 2.0 platform like Google vs. a PC platform. I wanted to share some of the comments that Mark made during the email thread that started with his request to "mash up" an OSCON web page with the new Google search APIs.

We first met Mark in 2001, when he keynoted at our P2P and Web Services Conference, speaking about Microsoft's ambitious Hailstorm initiative. (Before leaving Microsoft, Mark wrote a great summary of how, despite its bad rap, Hailstorm prefigured much of the really great internet-as-platform goodness that we're now celebrating as Web 2.0. I've always felt that if it hadn't come from Microsoft at the height of the antitrust brouhaha, Hailstorm would have been a huge success. It had so many of the right ideas.) Last year, Mark went to Google, where he is now building a set of AJAX interfaces and "gadgets" that make it as easy to use all of the Google search APIs as it is to build a web page.

The thread started with a simple request from Mark. "I am speaking at SES this wednesday on search apis. I am not sure everyone really understands the concept of search mashups so I wanted to show a real example, but do it in the context that everyone at the conference would understand... " He wanted to take the OSCON web page, add a Google Map Search widget to it, and use it in a slide at his Search Engine Strategies talk. I thought what he was doing was cool, and asked if I could blog his email. In the course of the ensuing email exchange, though, his infectious enthusiasm for the new platform bubbled over:

I tell you, doing platform development, in real time, with no friction is rule changing.... How odd is it that I can just tell you that I will write the code tonight or tomorrow and then whenever I feel like it, push a button that makes it available to the entire world? Have you ever worked with a platform like this before?

This last comment echoed a comment that Mark had made when I first met with him about his work at Google a couple of months ago, and which led me to invite him to the O'Reilly Radar Executive Briefing at OSCON. So now I had to ask if I could blog THAT.

And it kept getting better. Mark is such an articulate spokesman for how the rules of software development are changing! More than anyone else I've talked with, he has a keen sense of the contrast between the old world of software development and the new one. He wrote:

read more

Skyrider: Commercializing P2P

by tim on Aug 03, 2006

A few weeks ago, I had dinner with my old friend Ed Kozel, and he gave me a peek at what he's been working on, a new company called Skyrider that's building a platform for commercial P2P services. Skyrider still isn't saying that much publicly about what they're doing -- their press release (to which I contributed a quote) is mainly socializing the idea that P2P networks haven't gone away and in fact, have become bigger than the web as a source of internet traffic -- but I can tell you that if they can pull off even 10% of their ambitious vision, it's going to have a heck of an impact. Just as Google crystallized a new vision of how to monetize the web, leading to the explosion of innovation that we now call Web 2.0, Skyrider is building technologies that could bring a rich new commercial ecosystem to P2P. I can't wait till they start actually rolling out their services.

Speaking of Web 2.0, it's worth noting that a lot of my thinking about what we're now calling Web 2.0 didn't start with the web at all. It started with peer-to-peer file sharing, which was one of the first applications to realize the full power of the internet as platform. After Napster burst onto the scene in 2000, O'Reilly editor Andy Oram wrote a couple of articles pointing out that P2P wasn't just about file sharing, and the legality or lack thereof, but about the next generation of Internet applications. That woke me up big time. As I'd done with the open source summit in 1998, I decided the best thing to do was to pull together some of the best minds working on this issue, and see what they had to say to each other.

(I wrote an account of that summit in my article Remaking the Peer to Peer Meme, which also appeared in our 2001 book Peer to Peer. Those of you who have seen the "meme map" I did for What is Web 2.0? will see its roots in the earlier ones I did for open source and p2p, as outlined in that article. Andy also wrote an account under the provocative title "Peer to Peer Makes the Internet Interesting Again." Jon Udell also wrote a great firsthand account of the P2P summit. Meanwhile, our first P2P Conference eventually morphed into both Etech and The Web 2.0 Conference.)

File sharing went underground because of legal issues, but the technology continued to develop, and became even more powerful. Now, we're seeing a new wave of startups that are finding ways to leverage the power of P2P while creating legitimate commercial opportunities. Skype, from the founders of Kazaa, took P2P into another, less controversial application. Skyrider looks P2P file sharing in the face, and asks, can we make the financial opportunities so obvious that content owners and distributors embrace it in the light of day? Stay tuned.

read more

Job Trends: Web 2.0, AJAX and Ruby

by tim on Aug 03, 2006

In my posting Ruby Book Sales Pass Perl, I also showed some treemaps highlighting the growth of calls for Ruby skills in the job market. I mentioned the fact that Web 2.0 also was bright green in the jobs treemap. But I thought I'd give a little more detail on how these job categories are growing.

In order to understand the job growth graphs, you need to keep in mind that the number of job listings varies from month to month. Accordingly, we set the "job market size" to an arbitrary value at the start of our data in June 2005, and then show the increase (or decrease) relative to that number. So what we're comparing in the graph below is the relative month-to-month growth of the overall job market (in red) to the relative month-to-month growth of the specific technology job market (in blue). We're not comparing the absolute size of those markets.
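
That indexing scheme is simple to sketch: divide each month's listing count by the baseline month's count and report the percent change, so the baseline itself reads as 0%. The listing counts below are invented for illustration; only the method mirrors the graphs.

```python
def indexed_growth(counts, baseline_month):
    """Convert raw monthly listing counts into percent growth relative to a
    baseline month (the baseline month itself comes out as 0%)."""
    base = counts[baseline_month]
    return {month: 100.0 * (n - base) / base for month, n in counts.items()}

# Hypothetical monthly counts of job listings mentioning a given technology.
listings = {"2005-06": 10, "2005-12": 120, "2006-06": 430}

growth = indexed_growth(listings, "2005-06")
# growth["2006-06"] is 4200.0: June 2006 up 4200% over the June 2005 baseline.
```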

Here's the growth in jobs calling for Web 2.0 expertise, with June 2006 up about 4200% over June 2005:

web2jobtrend.png

Note how the Web 2.0 drumbeat begins at around the time of our second Web 2.0 Conference in the fall of 2005.

Here are the graphs for AJAX and Ruby. Note that these are not on the same scale as the graph for Web 2.0. AJAX job postings are up about 1400%, and Ruby job postings up about 500%:

read more

Programming Language Trends

by tim on Aug 02, 2006

In his presentation at OSCON, Roger Magoulas, the director of O'Reilly Research, provided some interesting graphs based on our trends data mart. Here's the 3-year programming language market share trend based on computer book sales:

Programming Language market share trend in computer books

I wrote yesterday about the rise of Ruby and Javascript, driven by the move towards Web 2.0 applications. Also worthy of note in these graphs is the long, slow decline of Java and C/C++, and the continuing rise in market share of C#. You can see how Ruby's sharp ascent follows the introduction of Rails, and that PHP's fortunes reversed as book sales showed that web developers in search of rapid development languages moved over to RoR (and Microsoft's ASP.Net suite of technologies).

Also worth noting on this chart is that book sales tend to spike even before the release of commercial languages, as the vendors work with publishers to get books out by the release date, while books on open source projects tend to trail the release date. On the one hand, you could interpret this as meaning that open source is slower to market, but I think it says the opposite: open source projects can move fast, and catch publishers by surprise. Few publishers expected the quick uptake of AJAX and Ruby on Rails.

read more

Open Source Licenses are Obsolete

by tim on Aug 01, 2006

Last week at OSCON, I made the seemingly controversial statement "Open Source Licenses Are Obsolete." During the Q&A period, Michael Tiemann of Red Hat and the Open Source Initiative took issue with my statement, pointing out just how much value open source licenses have created. I don't know whether he really didn't understand what I was saying or whether he was just intentionally misunderstanding to make his own point. But it's clear to me at least that the open source activist community needs to come to grips with the change in the way a great deal of software is deployed today.

And that, after all, was my message: not that open source licenses are unnecessary, but that because their conditions are all triggered by the act of software distribution, they fail to apply to many of the most important types of software today, namely Web 2.0 applications and other forms of software as a service.
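
The mechanics of that argument can be captured in a toy model (the predicate below is mine, not any actual license text): copyleft obligations attach at the moment of distribution, and a hosted service never reaches that moment.

```python
def source_obligations_triggered(modified: bool, distributed: bool) -> bool:
    """Toy model of a classic copyleft trigger: the duty to share source
    attaches only when modified code is distributed to others."""
    return modified and distributed

# A vendor shipping a modified program must offer its source...
shipping_vendor = source_obligations_triggered(modified=True, distributed=True)

# ...but a company running the same modified code as a hosted Web 2.0
# service never "distributes" it, so the trigger never fires.
hosted_service = source_obligations_triggered(modified=True, distributed=False)
```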

I've been banging this drum for many years. In fact, in preparing for my talk, I looked up an old discussion I'd had with Richard Stallman in Berlin during the summer of 1999. I had just given a talk (pdf) on what I was then calling infoware and now call Web 2.0, and made my point about the failure of open source licenses in the world of software as a service. Richard came up to the mike after my talk, and said:

read more

Webtide Launches at OSCON

by on Aug 23, 2007  

Webtide is a new services company that provides training, development and support services for Web 2.0 applications. Webtide was formally launched last week during OSCON.

read more

Building Scalable Websites

by tim on Jul 30, 2006

I was delighted to see Cal Henderson's Building Scalable Websites on Amazon's computer book bestseller list this morning. Cal's talk, How We Built Flickr, has become legendary, and the book captures many of the same insights about how to build software for Web 2.0.

calonamazon2.png

(Image updated 7/31 at 12:30 -- originally at #19, the book is at #11 right now.)

read more

Press Release: MindTouch Unveils Open Source Web 2.0 API for Microsoft .NET

by on Aug 23, 2007  

It's been a good day for announcements at OSCON. The latest (PDF) is from sponsor (thanks!) MindTouch:

MindTouch Unveils Open Source Web 2.0 API for Microsoft .NET
Web 2.0 applications made easy with MindTouch Dream

O'Reilly Open Source Convention (OSCON), Booth 911, Portland, OR - July 25, 2006 - MindTouch, a leading provider of on-site Web appliances for the small- to medium-sized business (SMB) market, today announced MindTouch Dream, a powerful open source Web 2.0 platform and software development kit (SDK) built on Microsoft Visual Studio® .NET.

read more

Press Release: ActiveState Previews Komodo 4.0 at O'Reilly Open Source Convention

by on Aug 23, 2007  

Longtime OSCON supporters ActiveState (thank you!) made this announcement:

ActiveState Previews Komodo 4.0 at O'Reilly Open Source Convention; Award-winning IDE for dynamic languages adds end-to-end web development capabilities

PORTLAND, Ore.--(BUSINESS WIRE)--July 25, 2006--ActiveState Software Inc., the leading provider of tools and services for dynamic languages, today announced the technical pre-release of Komodo 4.0, introducing advanced support for Web 2.0 technologies to the award-winning IDE for dynamic languages. The release is available for download at: http://www.activestate.com/komodo/.

read more

InfoWorld: Tim O'Reilly Opens Up OSCON

by on Aug 23, 2007  

Matt Asay, co-organizer of the O'Reilly Radar Executive Briefing and all-around open source go-to guy, writes:

Tim is in the middle of his opening keynote at OSCON, talking about the Big Ideas that are driving open source.

Tim just made an interesting point, one that I first heard r0ml make years ago at eGOVOS: open source is about efficient free markets, not licenses. Why does open source work (and why does Web 2.0, properly architected, work)? Because it feeds off the self-interest - not generosity - of individual developers. People scratch their own itches, and the overall community of software grows. Just as Adam Smith and free market economists have always said it would.

read more

Four Big Ideas About Open Source

by tim on Jul 19, 2006

In my O'Reilly Radar Executive Briefing next week at OSCON, I'm focusing on four big ideas about open source:

  1. The architecture of participation beyond software. Software development was the canary in the coalmine, one of the first areas to show the power of self-organizing systems leveraging the power of the internet to transform markets. But it didn't stop there. What we're now calling Web 2.0 is a direct outgrowth of the core principles that made open source software successful, but in my opinion, many of the projects and companies that make up the Web 2.0 movement have gone far beyond open source in their understanding of how to build systems that leverage what I call the architecture of participation.
  2. Asymmetric Competition. One of the most powerful things about open source is its potential to reset the rules of the game, to compete in a way that undercuts all of the advantages of incumbent players. Yet what we see in open source is that the leading companies have in many ways abandoned this advantage, becoming increasingly like the companies with which they compete. I have no concerns about the ultimate health of the open source development model or the vibrant creativity of the open source community, but I do question whether open source companies really grasp the implications of the new model. I think that if they did, they'd be Web 2.0 companies.
  3. How Software As a Service Changes The Points of Business Leverage. Operations and scalability lead to powerful cost advantages; increasing returns from network effects lead to new kinds of lock-in. The net effect is that even when running open source software, vendors will have lock-in opportunities just as powerful as those from the previous generation of proprietary software.
  4. Open Data. One day soon, tomorrow's Richard Stallman will wake up and realize that all the software distributed in the world is free and open source, but that he still has no ability to improve or change the computer tools that he relies on every day. They are services backed by collective databases too large (and controlled by their service providers) to be easily modified. Even data portability initiatives such as those starting today merely scratch the surface, because taking your own data out of the pool may let you move it somewhere else, but much of its value depends on its original context, now lost.
In short, you can see that I believe that there are serious challenges to the open source model. For all its success (and that success has been world-changing), it's important not to get complacent. The world is changing under our feet! The pendulum always swings between open and proprietary, and despite the apparent progress of open source and open standards, right now the pendulum is swinging the other way.


I have always felt most passionate in preaching to the open source community. It is one that I love and esteem most highly. But it is also one that is in great danger of increasing irrelevance, because some of the premises on which it has based its thinking are wrong. I've always had a perspective a bit at variance from that of other open source advocates, and as the future unfolds, I continue to feel that that perspective is essential for open source strategists to understand and embrace.

(For a succinct recap of the evolution of that perspective, see my 1998 essay, Hardware, Software and Infoware, my 2003 essay, The Open Source Paradigm Shift, and my 2005 essay, What is Web 2.0? Also, see my previous entry about the Radar Open Source briefing session.)

read more

Concentric Circles of Web 2.0 Heaven

by tim on Jul 18, 2006

Paul Kedrosky neatly summarized the gist of my post about the hierarchy of Web 2.0 applications in language worthy of Nat Torkington: "In short: A full-bore W2 app is a data-eating and excreting combination of Internet parasite and host." Colorful, memorable, and spot on.

read more

Lazyweb: My GMail Wishlist

by tim on Jul 18, 2006

I dropped something heavy on my laptop the other day just as the system was rebooting, and managed to trash the disk. While it's in the shop, I've had to rely on GMail as my main mail application. (I normally read mail with mail.app on the Mac, but have everything autoforwarded to GMail for backup and access from other computers.) Using GMail full time reminds me of all the things that I think it ought to do as a mail client from Google, which in theory is one of the internet-savviest companies around (see Levels of the Game), as well as things that I expect as normal good behavior, plus some things that I miss from old Unix mail clients like mh and mush/zmail.

Right now, GMail is operating at the lowest level of Web 2.0. It uses Ajax to give an online experience, but it actually doesn't do anything that can't be done in a local mail client. (And actually, since all mail is networked, even if the client itself is local, these same features could be implemented by anyone, from Microsoft to Apple. But it's GMail that's on my mind right now....) If anyone at Google is reading, please put these features on your punch list:

read more

Another CAPTCHA -- But I failed (partly)

by tim on Jul 17, 2006

On my last entry, synonymous commented: "I like the idea, but I think this captcha mashup may actually be more effective." I cracked up when I saw this. It uses "the hotornot API" (Web 2.0 is getting out of hand!) to offer up pictures of nine women (or men) and asks you to prove you're human by selecting the three "hot" ones. Politically incorrect, funny and thought-provoking, and potentially flawed by sexual bias to boot. I sent on a link to the radar team with the comment: "I have no problem with picking out the 'hot' women, but I struggle with the men. Some of the obviously 'hot' ones (presumably the right answers) look like doofuses to me, while with women, the mentula ("little mind" -- Latin slang for the male anatomy) easily gets it right. I wonder if women find the men as easy, and if so, how much tastes vary."

read more

Levels of the Game: The Hierarchy of Web 2.0 Applications

by tim on Jul 17, 2006

Reading Jim Fallows' new Technology Review article about his experiment in using only Web 2.0 applications for two weeks, I think: "What an odd thing to do! It's a bit like evaluating the utility of an automobile by forgoing your bedroom and sleeping in the back seat of your car for two weeks." Fallows is insightful, and he makes some good points (more on that later), but he also reveals just how hard it is for people to wrap their heads around Web 2.0. He says "Web 2.0's most important step forward seems to be the widespread adoption of Ajax." Alas, that is a common misconception.

Just because something uses Ajax and is presented on the web doesn't make it a Web 2.0 application. (Fallows does cite my What is Web 2.0? article in evaluating the first app he mentions, Dodgeball, but he doesn't apply much rigor to the other apps that he talks about. For example, he takes Writely as one of his test cases, and then judges the merits of Web 2.0 by how using an online application like Writely stacks up to a local application like Word. Writely is interesting, but it's hardly a canonical Web 2.0 application.)

The confusion leads me to think about a hierarchy of "Web 2.0-ness":

Level 3: The application could ONLY exist on the net, and draws its essential power from the network and the connections it makes possible between people or applications. These are applications that harness network effects to get better the more people use them. eBay, craigslist, Wikipedia, del.icio.us, Skype, (and yes, Dodgeball) meet this test. They are fundamentally driven by shared online activity. The web itself has this character, which Google and other search engines have then leveraged. (You can search on the desktop, but without link activity, many of the techniques that make web search work so well are not available to you.) Web crawling is one of the fundamental Web 2.0 activities, and search applications like AdSense for Content also clearly have Web 2.0 at their heart. I had a conversation with Eric Schmidt, the CEO of Google, the other day, and he summed up his philosophy and strategy as "Don't fight the internet." In the hierarchy of web 2.0 applications, the highest level is to embrace the network, to understand what creates network effects, and then to harness them in everything you do.

Level 2: The application could exist offline, but it is uniquely advantaged by being online. Flickr is a great example. You can have a local photo management application (like iPhoto) but the application gains remarkable power by leveraging an online community. In fact, the shared photo database, the online community, and the artifacts it creates (like the tag database) is central to what distinguishes Flickr from its offline counterparts. And its fuller embrace of the internet (for example, that the default state of uploaded photos is "public") is what distinguishes it from its online predecessors.

Level 1: The application can and does exist successfully offline, but it gains additional features by being online. Writely is a great example. If you want to do collaborative editing, its online component is terrific, but if you want to write alone, as Fallows did, it gives you little benefit (other than availability from computers other than your own.)

Level 0: The application has primarily taken hold online, but it would work just as well offline if you had all the data in a local cache. MapQuest, Yahoo! Local, and Google Maps are all in this category (but mashups like housingmaps.com are at Level 3.) To the extent that online mapping applications harness user contributions, they jump to Level 2.

You'll notice that I didn't mention Amazon in the hierarchy above. That's because I can't decide whether they belong on level 2 or 3. One can imagine an Amazon-style product catalog offline (for example, in a store), but Amazon is so persistent in harnessing online participation that they have almost managed to transcend the limits of their category. They've also built services, from Associates to S3, that make them completely a network citizen. So call them level 3, and a testament to the power of strategic effort to change the game.
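
The hierarchy is easy to restate as a small lookup table. A sketch (the level names are mine; the app assignments simply restate the examples above, with Amazon promoted to Level 3 per the last paragraph):

```python
from enum import IntEnum

class Web2Level(IntEnum):
    """The hierarchy of 'Web 2.0-ness' described above, highest first."""
    NET_NATIVE = 3      # could ONLY exist on the net; powered by network effects
    NET_ADVANTAGED = 2  # could exist offline, but uniquely advantaged online
    NET_ENHANCED = 1    # works fine offline; gains extra features online
    NET_HOSTED = 0      # lives online, but a local data cache would do

levels = {
    "eBay": Web2Level.NET_NATIVE,
    "Wikipedia": Web2Level.NET_NATIVE,
    "Amazon": Web2Level.NET_NATIVE,
    "Flickr": Web2Level.NET_ADVANTAGED,
    "Writely": Web2Level.NET_ENHANCED,
    "MapQuest": Web2Level.NET_HOSTED,
}

def most_web2(apps):
    """Return the apps at the highest level of the hierarchy."""
    top = max(levels[app] for app in apps)
    return [app for app in apps if levels[app] == top]
```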

read more

O'Reilly Radar: The Executive Briefing--Tim's Take

by on Aug 23, 2007  

Fearless Leader Tim O'Reilly has also posted thoughts on the upcoming Executive Briefing happening at OSCON next week:

I'm focusing the program around the intersection of Open Source and Web 2.0, because I continue to believe that even though Web 2.0 is deeply rooted in open source, it's the open source community that most needs to be reminded of the connection. The world that gave birth to the free and open source software communities has changed radically, and open source needs to change as a result.

read more

Levels of the Game: The Hierarchy of Web 2.0 Applications

by tim on Jul 17, 2006  By tim

Reading Jim Fallows' new Technology Review article about his experiment in using only Web 2.0 applications for two weeks, I think: "What an odd thing to do! It's a bit like evaluating the utility of an automobile by foregoing your bedroom and sleeping in the back seat of your car for two weeks." Fallows is insightful, and he makes some good points (more on that later), but he also reveals just how hard it is for people to wrap their heads around Web 2.0. He says "Web 2.0's most important step forward seems to be the widespread adoption of Ajax." Alas, that is a common misconception.

Just because something uses Ajax and is presented on the web doesn't make it a Web 2.0 application. (Fallows does cite my What is Web 2.0? article in evaluating the first app he mentions, Dodgeball, but he doesn't apply much rigor to the other apps that he talks about. For example, he takes Writely as one of his test cases, and then judges the merits of Web 2.0 by how using an online application like Writely stacks up to a local application like Word. Writely is interesting, but it's hardly a canonical Web 2.0 application.)

The confusion leads me to think about a hierarchy of "Web 2.0-ness":

Level 3: The application could ONLY exist on the net, and draws its essential power from the network and the connections it makes possible between people or applications. These are applications that harness network effects to get better the more people use them. eBay, craigslist, Wikipedia, del.icio.us, Skype, (and yes, Dodgeball) meet this test. They are fundamentally driven by shared online activity. The web itself has this character, which Google and other search engines have then leveraged. (You can search on the desktop, but without link activity, many of the techniques that make web search work so well are not available to you.) Web crawling is one of the fundamental Web 2.0 activities, and search applications like AdSense for Content also clearly have Web 2.0 at their heart. I had a conversation with Eric Schmidt, the CEO of Google, the other day, and he summed up his philosophy and strategy as "Don't fight the internet." In the hierarchy of Web 2.0 applications, the highest level is to embrace the network, to understand what creates network effects, and then to harness them in everything you do.

Level 2: The application could exist offline, but it is uniquely advantaged by being online. Flickr is a great example. You can have a local photo management application (like iPhoto) but the application gains remarkable power by leveraging an online community. In fact, the shared photo database, the online community, and the artifacts it creates (like the tag database) is central to what distinguishes Flickr from its offline counterparts. And its fuller embrace of the internet (for example, that the default state of uploaded photos is "public") is what distinguishes it from its online predecessors.

Level 1: The application can and does exist successfully offline, but it gains additional features by being online. Writely is a great example. If you want to do collaborative editing, its online component is terrific, but if you want to write alone, as Fallows did, it gives you little benefit (other than availability from computers other than your own.)

Level 0: The application has primarily taken hold online, but it would work just as well offline if you had all the data in a local cache. MapQuest, Yahoo! Local, and Google Maps are all in this category (but mashups like housingmaps.com are at Level 3.) To the extent that online mapping applications harness user contributions, they jump to Level 2.

You'll notice that I didn't mention either Amazon in the hierarchy above. That's because I can't decide whether they belong on level 2 or 3. One can imagine an Amazon-style product catalog offline (for example, in a store), but Amazon is so persistent in harnessing online participation that they have almost managed to transcend the limits of their category. They've also built services, from Associates to S3, that make them completely a network citizen. So call them level 3, and a testament to the power of strategic effort to change the game.
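The hierarchy above can be captured as data. This is a hypothetical encoding for illustration only, using the example applications named in the post (with Amazon placed per the final call of level 3):

```typescript
// Hypothetical encoding of the four-level "Web 2.0-ness" hierarchy,
// mapping each application named in the post to its assigned level.
type Web2Level = 0 | 1 | 2 | 3;

const levelOf: Record<string, Web2Level> = {
  eBay: 3, craigslist: 3, Wikipedia: 3, "del.icio.us": 3,
  Skype: 3, Dodgeball: 3, "housingmaps.com": 3,
  Amazon: 3, // borderline 2/3; the post settles on 3
  Flickr: 2,
  Writely: 1,
  MapQuest: 0, "Yahoo! Local": 0, "Google Maps": 0,
};

// Level 3: could ONLY exist on the net, powered by network effects.
const level3 = Object.keys(levelOf).filter((app) => levelOf[app] === 3);
console.log(level3.length); // 8
```

Putting the taxonomy in one table makes the key asymmetry visible: most of the canonical examples cluster at level 3, while the apps Fallows tested sit at levels 0 and 1.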

read more

O'Reilly Radar Executive Briefing Session at OSCON

by tim on Jul 13, 2006

I have a tendency to work on my upcoming speaking engagements as they pop off the queue. Sometimes this is unfortunate, as when there's a need for advance marketing. The O'Reilly Radar Executive Briefing session is a great example. I'd done some early brainstorming with Matt Asay that led to the print marketing piece, but didn't really start to fill in all the details till last week. I'm incredibly excited about the program, but a lot of new details have just gone up on the web when the event is less than two weeks away, making it tough for people who like to plan ahead. Sorry.

I'm focusing the program around the intersection of Open Source and Web 2.0, because I continue to believe that even though Web 2.0 is deeply rooted in open source, it's the open source community that most needs to be reminded of the connection. The world that gave birth to the free and open source software communities has changed radically, and open source needs to change as a result.

I kick off the day with a conversation with Chris DiBona from Google, Jeremy Zawodny from Yahoo!, and Jim Buckmaster from Craigslist, exploring how they use open source in their companies, and how web 2.0 applications harness their users at levels far above the code. I'll also do my best to put them in the hot seat about how they give back to open source. Next, we hear from Bill Hilf about how Microsoft is thinking about open source. Next up: conversations with Jim Buckmaster, Ian Wilkes, and Brian Behlendorf to highlight two of the big ideas that Web 2.0 challenges us with: asymmetric competition and operations as advantage in a world where software is delivered as a service.

Michael Tiemann, Marten Mickos, and David Skok (the VC behind JBoss) will join us to give perspective on how they see open source's competitive position at the tipping point. We're well into the third stage of Gandhi's progression (cited by Eric Raymond in the early days of open source activism): "First they laugh at you. Then they fight you. Then you win." But winning brings its own challenges. I'll press the panel on the question of how, having disrupted the traditional software ecosystem, they might now themselves be disrupted (or continue to be advantaged) by the move to network computing.

After lunch, we'll run a session I'm calling "spotlight," in which 8 open source companies each have ten minutes to wow us with the big ideas behind their projects. These are all companies and people that ought to be on your radar, and I'll do my best to tell you why.

Irwin Gross will talk about new ideas for building a real marketplace for "intellectual property." We'll then talk about open data. (I'm absolutely convinced that the next Richard Stallman is already in our midst. Except that he's going to be leading a campaign to free our data from lock in, rather than our software.) We'll talk to Yahoo! about the first shots that have already been fired in the open data wars.

We finish the day in conversation with Mark Lucovsky, once the architect of Microsoft Hailstorm, now working to build the next generation of open services at Google. Mark tells some amazing stories about just how different it is to develop on the "Google platform" than on the Microsoft platform. And then, we'll hear from that Ginsu knife of open source: Mozilla, home of the most widely used programming language that doesn't get the respect it deserves (JavaScript), the browser that's becoming the new standard (Firefox), and one of the oft-overlooked contenders in the budding infrastructure for the internet as platform. We talk with Mike Schroepfer, the VP of Engineering for Mozilla, about all this and more.

This isn't just the usual suspects. I'm hoping to give a really different perspective on open source. There's a real risk when you're too close to something that you can fail to understand just what's so important about it. It's also going to be an incredibly intense day, with big ideas and important speakers following in rapid succession. I expect my head to explode by the end of the day.

If you can't make the event, but do care about the future of open source, you should at least read the writeup, and think about the ideas and people who are listed there. I'll also give a 15-minute recap of the day on the main stage the next morning.

And, as usual, let me know what I've missed. (It was tough to fit everything in, and I know I left out a lot I'll be kicking myself about tomorrow.)

read more

Looking forward to OSCON

by allison on Jul 07, 2006

OSCON is just around the corner. I've spent the past few weeks lining up the final details of the program. I'm really happy with it. We've got a good mix of talks from the big open source projects and languages, some hot web technologies, and a few surprises. One of the fun aspects of organizing a conference like this is the chance to throw some real gems into the mix. Here are a couple of talks I'm particularly looking forward to that you might miss if you're not looking closely:

Web Heresies: The Seaside Framework by Avi Bryant - Seaside is an open source web framework written in Smalltalk. Everyone looks up to Rails as an innovative, easy-to-use, rapid-development framework, but the Rails developers look up to Seaside. It's worth listening to the interview with Avi on the Ruby on Rails Podcast from late last year.

Programming the Kernel for Web 2.0
by Audrey Tang - We talk about Web 2.0 as an "operating system", but most of the development energy in Web 2.0 goes into the "desktop applications" for the platform. Audrey has taken some of the best ideas from Rails, Seaside, Spring, and a handful of other frameworks and pushed them to the next level. It's a taste of what we're likely to see more and more of in the coming years as the Web 2.0 "kernel" matures.

read more

The New Hallucinogens

by tim on Jul 07, 2006

This millennial "New Age" aspect of what we're now calling Web 2.0 was a big feature of Kevin Kelly's August 2005 Wired article, We Are the Web, which provoked Nicholas Carr's stinging rebuttal, The Amorality of Web 2.0. Roger Magoulas, the director of O'Reilly Research, has another take on the same subject. He wrote in email:

"I've been taking care of a neighbor's cat, and while waiting for the cat to return one night I did something I don't normally have time to do - speed remoting through cable TV channels. My attention was drawn to a VH-1 special called The Drug Years - a four-hour documentary about drug use and its consequences over the last four or five decades. I was struck by how often the pundits and folks interviewed used the same adjectives and metaphors to relate their drug experiences as we often hear used to describe the potential of software and the internet: mind expanding / mind blowing; connecting with everyone; global consciousness; new colors and shapes; combining music, colors and movement (i.e., animation); insiders vs. outsiders; the importance of play; ambivalence towards material wealth; etc.
 

Especially the section that focused on the 60's seemed to capture the same utopian euphoria I'm hearing in the current technology environment (and what I heard during the first boom). Besides the use of mind-blowing adjectives, four themes spanned 60's drug taking and the current technology wave: connecting to everyone (in some type of lovefest context); there are people who get it and people who don't; multi-media helps define what's happening; and, that everything will be different now (in an undefined way).

I know John Markoff made a connection between drug use and the original personal computer hackers in What the Dormouse Said: How the 60's Counterculture Shaped the Personal Computer Industry, but I think this is different. Maybe the 60's counterculture never ended, it just went underground only to reemerge as a source of memes for the current technology culture.

So, following my logic, Web 2.0, DIY, open source, blogs, and data are the new hallucinogens, only now it's all legal.

P.S. Hope the subject text doesn't trigger any extra scrutiny from the feds - evidence of another similarity between 60's drug culture and 21st century technology, justified paranoia."

While it's easy to use the parallels to dismiss Web 2.0, to do so is in fact to miss the enormous transformative power of the sixties counter-culture. Millennial thinking is always over the top, but the human longing for transformation and transcendence is nonetheless a powerful force for change. Culture moves in a spiral, not a circle; history repeats itself dynamically, not statically.

read more

Operations: The New Secret Sauce

by tim on Jul 10, 2006

I spoke last week with Debra Chrapaty, the VP of Operations for Windows Live, to explore one of the big ideas I have about Web 2.0, namely that once we move to software as a service, everything we thought we knew about competitive advantage has to be rethought. Operations becomes the elephant in the room. Debra agrees. She's absolutely convinced that what she does is one of the big differentiators for Microsoft going forward. Here are a couple of the most provocative assertions from our conversation:

  1. Being a developer "on someone's platform" may ultimately mean running your app in their data center, not just using their APIs.

  2. Internet-scale applications are pushing the envelope on operational competence, but enterprise-class applications will follow. And here, Microsoft has a key advantage over open source, because the Windows Live team and the Windows Server and tools team work far more closely together than open source projects work with companies like Yahoo!, Amazon, or Google.

Let me expand on these two points. Did you ever see the children's book, Cloudy, With a Chance of Meatballs? I summarize my conversation with Debra as "Cloudy, with a chance of servers." (I started with a pun on the title, but the book description from Amazon is particularly apt: "If food dropped like rain from the sky, wouldn't it be marvelous! Or would it? It could, after all, be messy. And you'd have no choice. What if you didn't like what fell? Or what if too much came? Have you ever thought of what it might be like to be squashed flat by a pancake?" But I digress. Back to Debra.)

People talk about "cloud storage" but Debra points out that that means servers somewhere, hundreds of thousands of them, with good access to power, cooling, and bandwidth. She describes how her "strategic locations group" has a "heatmap" rating locations by their access to all these key limiting factors, and how they are locking up key locations and favorable power and bandwidth deals. And as in other areas of real estate, getting the good locations first can matter a lot. She points out, for example, that her cost of power at her Quincy, WA data center, soon to go online, is 1.9 cents per kWh, versus about 8 cents in California. And she says, "I've learned that when you multiply a small number by a big number, the small number turns into a big number." Once Web 2.0 becomes the norm, the current demands are only a small foretaste of what's to come. For that matter, even server procurement is "not pretty" and there will be economies of scale that accrue to the big players.
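Her "small number times big number" point is easy to make concrete. A back-of-envelope sketch: the two electricity rates are the ones quoted above, while the server count and per-server power draw are invented here purely for illustration:

```typescript
// "When you multiply a small number by a big number, the small number
// turns into a big number." Rates are from the article; the fleet size
// and average draw below are assumptions for illustration only.
const rateQuincy = 0.019; // $/kWh at the Quincy, WA data center
const rateCA = 0.08;      // ~$/kWh in California
const servers = 100_000;  // assumed fleet size
const kwPerServer = 0.3;  // assumed average draw per server, in kW
const hoursPerYear = 24 * 365;

const kwhPerYear = servers * kwPerServer * hoursPerYear; // 262,800,000 kWh
const savings = kwhPerYear * (rateCA - rateQuincy);
console.log(Math.round(savings)); // ≈ $16M per year from siting alone
```

Even at these modest assumed numbers, a six-cent rate difference compounds into eight figures a year, which is exactly why a "strategic locations group" exists.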

Her belief is that there's going to be a tipping point in Web 2.0 where the operational environment will be a key differentiator. I mentioned the idea that Web 2.0 has been summed up as "Fail Fast, Scale Fast," and she completely agreed. When it hit its growth inflection point, MySpace was adding a million users every four days -- not at all an easy feat. As these massive apps become the norm, unless you can play in a game where services can be highly stable, geodistributed, etc., you won't be in the game. And that's where she came to the idea that being a developer "on someone's platform" may ultimately mean running your app in their data center. Why did FedEx win in package delivery? They locked up the best locations with access to airports, warehousing, etc. so they had the best network. A similar thing will happen with packet delivery.

Who are the competitors of the future in this market? Microsoft, Google, Yahoo! and the telcos were the folks she called out, with a small nod to Amazon's platform aspirations. (Sure enough, in true news from the future style, into my inbox comes Jon Udell's review of a new service called OpenFount: "Openfount's big idea is that a solo developer ought to be able to deploy an AJAX application to the web without worrying about how to scale it out if it becomes popular. If you park the application's HTML, JavaScript, and static data files on Amazon's S3 storage service, you can make all that stuff robustly available at a cost that competes favorably with conventional hosting.")

Debra also talked about the importance of standardization, so that increasing capacity is as seamless as possible. "We've got to make it like air and water." In that regard, another very interesting point we discussed was Ian Wilkes' thought that database tools were still weak when it came to operational provisioning. That was where she came to the second point above, that Microsoft has a key competitive advantage here. Internet-scale applications are really the ones that push the envelope with regard not only to performance but also to deployment and management tools. And the Windows Live team works closely with the Windows Server group to take their bleeding edge learning back into the enterprise products. By contrast, one might ask, where is the similar feedback loop from sites like Google and Yahoo! back into Linux or FreeBSD?

As Shakespeare said, "The game's afoot." Debra put more servers into production in the last quarter than she put in place in all of the previous year, and she thinks this is just the beginning. Operations used to be thought of as boring. It's now ground zero in the computing wars.

read more


FJAX: Ajax with Flash

by tim on Jun 23, 2006

O'Reilly editor Brian Sawyer pointed me to this interesting Webmonkey article on Fjax, a Flash-powered variant of Ajax.

"Fjax is an alternative method for doing the kind of Web 2.0 builds that are currently done in Ajax. The advantage is that it does it in a fraction of the size, and requires no code forking to work in the different browsers. It's a streamlined way of doing asynchronous content updates with XML...
 

Fjax uses the Flash Player to load a 1 pixel by 1 pixel transparent SWF to simply get XML from the server. Once it has the XML, it parses it into HTML and then lets JavaScript know it's ready. JavaScript then gets the HTML from Flash and DHTMLs it into the web page — it uses JavaScript to write (X)HTML/CSS onto the page.

In the end, Fjax gets XML and delivers HTML. It doesn't collaborate with Ajax. It doesn't need to. It doesn't load data visually into a Flash movie for presentation. It could, but that is not the point. It doesn't generate SWFs or require a server side component. It is its own thing. Oh, and did we mention it's only 65 lines of code? And it's free."

This does sound interesting, since Flash is indeed ubiquitous and powerful. Using the Flash engine to enable portability without requiring too much in the way of changed development practices and tools could be interesting. (Although to be fair, Flash ActionScript is really just JavaScript with some extensions, and for many uses the constructs that Flash provides for animation, like the timeline, long a struggle for many developers, are not needed.)
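The transform the quote describes -- a hidden SWF fetches XML, client-side code turns it into HTML, and the HTML is written into the page -- amounts to something like the sketch below. This is a hypothetical illustration, not the Fjax library itself; the `<item>` element name is invented:

```typescript
// Sketch of the Fjax-style step "it parses it into HTML and then lets
// JavaScript know it's ready": convert a fetched XML payload into an
// HTML fragment for insertion into the page. Hypothetical illustration;
// the <item> element name and this function are not part of Fjax.
function xmlItemsToHtml(xml: string): string {
  const items: string[] = [];
  const re = /<item>([\s\S]*?)<\/item>/g;
  let m: RegExpExecArray | null;
  while ((m = re.exec(xml)) !== null) {
    items.push(`<li>${m[1].trim()}</li>`);
  }
  return `<ul>${items.join("")}</ul>`;
}

// In Fjax the XML would arrive via the 1x1 SWF; here we use a literal.
const html = xmlItemsToHtml("<feed><item>one</item><item>two</item></feed>");
console.log(html); // <ul><li>one</li><li>two</li></ul>
```

The point of the pattern is that the Flash side does only the transport, so the page-side code stays small -- consistent with the "65 lines of code" claim.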

But does it work? I just went to the Fjax site (using Firefox on a Mac) and got this message (from VBScript no less): Error parsing XML Data Error parsing XML Data Error parsing XML Data [Note: Jay McDonald writes in the comments below that this was an overload on the server host website, and has been fixed. The site now works fine. In any event, not a problem with Fjax. The site is quite snappy now.]

Disclosure: I was formerly a director for Macromedia, but since the acquisition by Adobe, I have no relationship to the company beyond those of a publisher on Adobe technologies.

read more

BusinessWeek Podcast: O'Reilly's Guide to Web 2.0

by on Aug 23, 2007  

Rob Hof interviews fearless leader Tim O'Reilly about Web 2.0.

There's no buzzword more popular in tech today than Web 2.0. Conceived during a brainstorming session for what became the Web 2.0 Conference now held annually by O'Reilly Media Inc. and CMP Media, Web 2.0 describes the new online services such as the volunteer-written encyclopedia Wikipedia, Yahoo's Flickr photo-sharing site, online marketplace eBay, and search engine Google. Unlike most of the first generation of Web sites, these services have an innate social component, often "harnessing collective wisdom," as O'Reilly Media CEO Tim O'Reilly puts it. Here, O'Reilly explains what Web 2.0 means for business, as well as what executives should be wary of when they try to use Web 2.0 services in their own companies.

Near the end of the conversation, Tim addresses the recent Web 2.0 service mark issue.

read more

NSA Datamining Social Networking Sites

by tim on Jun 09, 2006

Richard Forno wrote on Dave Farber's IP List: "New Scientist Magazine has discovered that Pentagon's National Security Agency, which specialises in eavesdropping and code-breaking, is funding research into the mass harvesting of the information that people post about themselves on social networks. And it could harness advances in internet technology - specifically the forthcoming "semantic web" championed by the web standards organisation W3C - to combine data from social networking websites with details such as banking, retail and property records, allowing the NSA to build extensive, all-embracing personal profiles of individuals."

Slashdot picked up the story, so many of you have probably already seen it. However, both the New Scientist story and much of the followup laid a misleading trail in the focus on social networking sites. The point is that one of the side effects of Web 2.0, the internet as platform, is that we all leave tracks in cyberspace, whatever sites we use. The kind of work described in this story is going on not just at the NSA, but in commercial operations doing data mining and profiling for everything from direct sales to political fundraising. And while a lot of this work is going on behind closed doors, some of it is open to the public. Look at fundrace and zillow (and trace an individual from one to the other), and you'll see how the tools of surveillance and privacy invasion are becoming "democratized" (so to speak.)

And this isn't all bad -- the paper that triggered the story was not about datamining MySpace and similar networks at all. It was about looking for conflicts of interest among reviewers of scientific papers, which would be a useful tool indeed, and one very relevant to the story I posted the other day about Nature's experiments with open peer review. (What led New Scientist to the NSA connection was a footnote that identified an NSA group called ARDA as one of the funders of the research.)

Another interesting tidbit to me was the fact that the DOD has renamed that NSA group "The Disruptive Technology Office." I'm sure that they are looking at a lot more than data mining ... as should all of you.

P.S. Slashdot clearly got the story from IP, since it paraphrases Richard rather than the New Scientist story itself, but doesn't credit him as the source. I'd love to see more bloggers using (and acknowledging) mailing lists as their source, because there are a lot of very interesting conversations that happen in other places than blogs...

read more

Web 2.0 Service Mark Controversy (Tim responding this time)

by tim on May 30, 2006

A lot of people have been waiting for a statement from me (Tim O'Reilly) about the Web 2.0 service mark. I'm back, and here it is. This is a long post, because the issues are complex, and I hope people will read to the end. I'll do my best to set the record straight, and to answer some of the comments and questions that have been flying around. I apologize to Tom for the unnecessary lawyer's letter, and ask that he apologize to me for the way he stirred up the mob. I then explain the back story behind the registration of the Web 2.0 mark, and our current thinking about what to do about it. Because the post is so long, and I don't want to break it in the middle, only this notice about the post appears on the Radar home page. Please click through to read the full posting.

read more

More on our "Web 2.0" service mark

by nat on May 26, 2006

This is an update to Brady's post from last night. It's from Sara Winge, our VP of communications who's taking point on this until Monday when Tim returns from vacation. Brady and I are posting these, by the way, because Sara's not one of the Radar bloggers and it felt strange to add her only for this.

Donagh Kiernan of IT@Cork (to whom the letter was addressed) graciously talked with me late in the work day on a Friday (Irish time), and we've resolved the service mark issue. O'Reilly and CMP are fine with IT@Cork using "Web 2.0" in the name of their June 8 conference. And I apologized again to Donagh for the tone of our letter, and for the fact that we didn't contact IT@Cork before sending it. That's not the way we want to do business, and as a few of you (OK, more than a few) have noted, it was a mistake.

I'd also like to reiterate that, as Web 2.0 Conference co-chair John Battelle noted, "Remember, Web 2.0 is also about having a business that works. And not protecting your trademarks is simply bad business practice." We're not claiming exclusive use of "Web 2.0" in all contexts. Our service mark applies only to "Web 2.0" when used in the *title* of "live events" such as conferences and tradeshows.

read more

Controversy about our "Web 2.0" service mark

by brady on May 25, 2006

Sara Winge, our VP of Corporate Communications, asked me to post this:

The blogosphere has been buzzing today about O'Reilly sending a cease-and-desist letter to IT@Cork, demanding (yes, that's the legal term) that they not use "Web 2.0" in the title of their conference. I'd like to give the O'Reilly perspective, and clear up a few things. You'd be hearing from Tim, but he's off the grid, on a (rare) vacation.

O'Reilly and CMP co-produce the Web 2.0 conference. "Web 2.0" was coined when we were brainstorming the concept for the first conference in 2003. As noted in the letter to IT@Cork (sent from CMP's attorney, but with our knowledge and agreement), "CMP has a pending application for registration of Web 2.0 as a service mark, for arranging and conducting live events, namely trade shows, expositions, business conferences and educational conferences in various fields of computers and information technology." To protect the brand we've established with our two Web 2.0 Conferences, we're taking steps to register "Web 2.0" as our service mark, for conferences. It's a pretty standard business practice. Just as O'Reilly couldn't decide to launch a LinuxWorld conference, other event producers can't use "Web 2.0 Conference," the name of our event. In this case, the problem is that IT@Cork's conference title includes our service mark "Web 2.0," which the law says we must take "reasonable steps" to protect. We've also contacted another group that has announced a "Web 2.0 Conference" in Washington, DC this September.

In retrospect, we wish we'd contacted the IT@Cork folks personally and talked over the issue before sending legal correspondence. In fact, it turns out that they asked Tim to speak at the conference, though our Web 2.0 Conference team didn't know that. We've sent a followup letter to Donagh Kiernan, agreeing that IT@Cork can use the Web 2.0 name this year. While we stand by the principle that we need to protect our "Web 2.0" mark from unauthorized use in the context of conferences, we apologize for the way we initially handled the issue with IT@Cork.

[Added 26 May 2006] A further update is here.

[Added 27 May 2006 by Marc:] I deleted a comment that insinuated Tim is a child molester.

[Added 1 June 2006 by Brady:] You can see Tim's response to the controversy in this post.

read more

Government Thinking about Web 2.0

by tim on May 25, 2006

I recently received the following mail from someone who desired to remain anonymous:

"My unit leads the website for a scientific agency of the federal government. We are beginning the process of overhauling our site and have an opportunity to do something more than just a minor update. I've been reading about the principles of Web 2.0 and can imagine a site that leverages our scientific expertise and credibility and engages the public in a manner that allows the cumulative experience of our public and our partners to make the content, experience, and impact grow exponentially over time. The challenge is that as a governmental website, there are significant limits to our ability to collect information from/on users and limits to our ability to allow users to add their own feedback and content without undermining our scientific credibility."

Aside from the fact that the need to remain anonymous is a very bad sign with regard to the possibilities for this agency, what advice can radar readers give? I do note that the USPTO has recently floated a plan to get user input on patent applications, so it's not entirely impossible to move the government into the Web 2.0 era.

My first piece of advice is that it's important to distinguish between collecting private information and collecting or monitoring aggregate or public information. Declan Butler of Nature is doing a great job mapping H5N1 news reports onto Google Earth, for example. At the end of the day, it's not even about collecting information on your site. The best way to make your site Web 2.0 is actually to expose your data in ways that let other people re-use it. What if your data could be a layer on Google Earth?
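To make the "expose your data" advice concrete, here's a minimal sketch of what that can look like in practice: turning a small dataset of named points into a KML file that Google Earth (or any KML-aware tool) can load as a layer. The sample records and field names are hypothetical placeholders, not any agency's real data.

```python
# Hypothetical sample data: named points with latitude/longitude.
records = [
    {"name": "Site A", "lat": 37.8716, "lon": -122.2727},
    {"name": "Site B", "lat": 38.5816, "lon": -121.4944},
]

def to_kml(records):
    """Render point records as a minimal KML document."""
    # KML coordinates are longitude-first, then latitude.
    placemarks = "".join(
        "<Placemark><name>{name}</name>"
        "<Point><coordinates>{lon},{lat}</coordinates></Point>"
        "</Placemark>".format(**r)
        for r in records
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<kml xmlns="http://www.opengis.net/kml/2.2">'
        "<Document>" + placemarks + "</Document></kml>"
    )

print(to_kml(records))
```

Publishing a file like this at a stable URL is all it takes for others to overlay the data on their own maps, which is exactly the kind of re-use the post is arguing for.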

In general, Nature is doing a great job of applying Web 2.0 to science, so studying what they are doing would be a great starting point with regard to your worries about credibility.

read more

Building Scalable Websites

by tim on May 22, 2006  Those of you who liked my Database War Stories series of posts are going to love Cal Henderson's new book, Building Scalable Web Sites. Cal is of course the lead web developer for Flickr, and the guy who put fear into the hearts of IT shops everywhere with his reported practice of pushing builds of Flickr out to the net every half hour, demonstrating just how different web development can be from traditional software development. In this book, he gives practical advice on everything from web application architecture, to development environments, to performance optimization and database scaling. Not to mention Web services API design. Oh, and how to do it all on the cheap. Definitely one for my Web 2.0 reading list. read more

The Economist: The Enzyme that Won

by on Aug 23, 2007  

It's The Economist's turn to ponder "what exactly is Web 2.0?", gleaned, I suspect, from the press preview for Where 2.0 we held recently, which took a little detour into a Web 2.0 discussion.

read more

My Commencement Speech at SIMS

by tim on May 14, 2006  Yesterday, I was asked to give the commencement address for the UC Berkeley School of Information (often known as SIMS, for its old name, the School of Information Management and Systems).
 

I was happy to see that the three masters projects selected for special awards (ibuyright, a cellphone application for retrieving environmental, social, and health information about products; mycroft, which seems to be a kind of distributed version of Amazon's Mechanical Turk; and backchannels, a study of the use of backchannel communication by SIMS students) were all on topics close to the Radar's heart :-) [Correction: when I wrote this entry, I inadvertently omitted mycroft, and said that homeskim, an improved interface for apartment searching, had won the award. Homeskim received an honorable mention.]

I promised to put the text of my speech on the web, so here it is. I tried to weave together the kind of life advice expected at a graduation with some of my thoughts on Web 2.0 in a summary format suitable not just for the students but also the parents and grandparents attending. I hope it's not too uneasy a mixture. read more

Behavioral Economics

by tim on May 09, 2006

Nat simply wrote "Economists discover marketing" in a radar backchannel email rather than taking the time to write an entry about The Marketplace of Perceptions, the Harvard Magazine article on behavioral economics he was referencing :-) There's a section in the article about "intertemporal choice", exemplified by "Wimpy, Popeye’s portly friend with a voracious appetite but small exchequer, who made famous the line, 'I’ll gladly pay you Tuesday for a hamburger today.'"

This is only one of the many subjects covered in this fascinating article about how the way issues are framed affects people's choices far more than classical economics would expect.

One paragraph particularly caught my eye, because it echoes the insight from Dan Bricklin's Cornucopia of the Commons that I frequently reference in my talks. Dan argues that part of the genius of Napster was that it made sharing the default, rather than a voluntary contribution.

The article applies this idea to 401(k) plans: “People want to be prudent, they just don’t want to do it right now,” [David] Laibson says. “You’ve got to compel action. Or enroll people automatically.” When he was U.S. Treasury Secretary, Lawrence Summers applied this insight. “We pushed very hard for companies to choose opt-out [automatic enrollment] 401(k)s rather than opt-in [self-enrollment] 401(k)s,” he says. “In classical economics, it doesn’t matter. But large amounts of empirical evidence show that defaults do matter, that people are inertial, and whatever the baseline settings are, they tend to persist.”

It was this idea of the importance of defaults that led me to frame the idea of "the architecture of participation." In discussions with Web 2.0 startups, I tend to hammer on this point: what are you doing to make participation automatic, and to make sharing as widely as possible the default behavior? (This is one reason Flickr became so successful. While Web 1.0 photo sharing services made sharing with your friends and family the default, Flickr made "world" the default setting.)

read more

Database War Stories #8: Findory and Amazon

by tim on May 04, 2006  Once Greg Linden had pinged me about BigTable (leading to yesterday's entry), it occurred to me to ask Greg for his own war stories, both at Findory and Amazon. We hear a recurrent theme: the use of flat files and custom file systems. Despite the furor that ensued in the comments when Mark Fletcher and Gabe Rivera noted that they didn't use traditional SQL databases, Web 2.0 applications really do seem to have different demands, especially once they get to scale. But history shows us that custom, home-grown solutions created by alpha geeks point the way to new entrepreneurial opportunities... read more

More Web 2.0 Summit Conference News