Tying The Pieces Together

Dec 24, 2007  |  Michael Wurzer

As the MLS industry embarks on a new year, refocused on data standards and data sharing, we may be able to learn from similar phenomena in related industries.

Last week, I read a post by Alex Russell called The W3C Can’t Save Us. The post deals with web browser standards (HTML, CSS, etc.) and how they seem stuck and lacking in innovation, but the ideas are instructive for those of us in real estate, too:

It’s clear then that vendors in the market are the ones who deploy new technologies which improve the situation. The W3C has the authority to standardize things, but vendors have all the power when it comes to actually making those things available. Ten years ago, we counted on vendors introducing new and awesome things into the wild and then we yelled at them to go standardize. It mostly worked. Today, we yell at the standards bodies to introduce new and awesome things and yell at the browser vendors to implement them, regardless of how unrealistic they may be. It doesn’t work. See the problem?

Applying this to the MLS industry, we may be seeing the same thing, only earlier in the process. Brokers and agents are yelling at MLSs today to standardize. The inflection point I see is whether, ten years from now, brokers and agents will instead be yelling at the standards bodies to “introduce new and awesome things” because innovation has stagnated, with newly centralized databases restricting access to all but the “standard” approaches, which necessarily have limits.

How can we build the standards process and large central repositories so they encourage instead of limit innovation? That’s a central question facing our industry now, and one answer may be in Mr. Russell’s post itself, where he states: “To get a better future, not only do we need a return to ‘the browser wars’, we need to applaud and use the hell out of ‘non-standard’ features until such time as there’s a standard to cover equivalent functionality. Non-standard features are the future, and suggesting that they are somehow ‘bad’ is to work against your own self-interest.”

More instructive thoughts come from Tim O’Reilly in When Markets Collide, from his Release 2.0 series (PDF), in which he analyzes how Wall Street’s use of technology may be predictive of Web 2.0’s use of technology, and vice versa. In that paper, O’Reilly states:

Web 2.0 may have begun with decentralization and peer-to-peer architectures, but if Wall Street and Google are guides, it will end with massive, centralized data centers extracting every last drop of performance. This trend is already apparent with the rise of applications based on massive data centers, where everything from performance and scalability to cost advantages will mean that, as Debra Chrapaty, Microsoft’s corporate vice president of global foundation services, once remarked, ‘In the future, being a developer on someone’s platform will mean being hosted on their infrastructure.’

We can already see this same centralization trend happening now in the MLS industry, with the ultimate centralization being proposed by the NAR with its Gateway concept. As Mr. O’Reilly points out, centralization has advantages, but it has disadvantages, too. If the creation of large repositories (or a single repository) is beneficial or inevitable, then it makes sense to pause during that creation to ask how we can protect against the calcifying bureaucracy and concentrated power that can result. Taken together, standards and centralization could easily become more of a nightmare than a solution if some forethought isn’t given to ensuring that we’re building a platform for innovation instead of against it.

That power comes from data is reiterated by Tim O’Reilly in his recent article Google Admits Data is the Intel Inside, which notes that Google is building and acquiring many disparate applications (mapping, Goog-411, video, etc.) so that they have the data available for building better search. In other words, they have little (or no) intention of making money from these other applications; their objective is to build a bigger, better (proprietary) data store they can leverage to make more money. Back to the “When Markets Collide” paper:

More thought-provoking is the trend of Google and Yahoo to provide more direct results for many common types of data. If the financial markets are any guide, we will see search engines providing direct access to many more data types over time, with search engines increasingly competing with their former customers to be the preferred target for a given search. Beware of relying on a search giant’s API for the existence of your business. . . .

In this insight, we see a controversial but defensible projection in which Web 2.0, born in a vision of openness and sharing, will end with private data pools controlled by large companies and used disproportionately for their own benefit. Network effects driven by user participation lead to increasing returns in the size and value of the databases that are created as a result. As the Web 2.0 platform matures, we expect to see more companies capitalizing on these insights. Information may want to be free, but valuable information, it seems, is, as always, still being hoarded.

So who is going to win and lose in this emerging environment of data sharing and pooling, which will produce consolidation, control and power? As MLSs participate in these initiatives, are they recognizing the transfer of power that’s occurring? Are they getting fair value in return? Will the MLS platforms and standards being built today encourage or inhibit innovation, which only occurs through competition?

So that this post does more than just ask questions, let me suggest that one way to encourage innovation is the distributed repository idea I posed in Regionals, Part II. The idea is that MLSs or others contributing data to the repository should have a mutual right to withdraw their data from it, creating the possibility of many regionals or repositories competing with one another, and that competition will necessarily produce innovation.
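To make that withdrawal mechanism concrete, here is a minimal sketch in Python. The names (SharedRepository, contribute, withdraw) and the toy Listing record are illustrative assumptions of mine, not part of any actual MLS system or of the Regionals, Part II proposal itself:

    from dataclasses import dataclass, field

    @dataclass
    class Listing:
        listing_id: str
        contributor: str  # the MLS that contributed this listing
        data: dict

    @dataclass
    class SharedRepository:
        """Toy model of a pooled listing repository whose members
        retain the right to withdraw their own data at any time."""
        name: str
        listings: dict = field(default_factory=dict)  # listing_id -> Listing

        def contribute(self, listing: Listing) -> None:
            self.listings[listing.listing_id] = listing

        def withdraw(self, mls_name: str) -> list:
            """A member exercises its withdrawal right: all of its
            listings leave this pool and are returned to the member,
            free to be contributed to a competing repository."""
            withdrawn = [l for l in self.listings.values()
                         if l.contributor == mls_name]
            for l in withdrawn:
                del self.listings[l.listing_id]
            return withdrawn

    # Usage: an MLS unhappy with one regional pool moves its data
    # to a competitor, so repositories must compete on value.
    regional_a = SharedRepository("Regional A")
    regional_b = SharedRepository("Regional B")
    regional_a.contribute(Listing("1001", "Springfield MLS", {"price": 250000}))
    for listing in regional_a.withdraw("Springfield MLS"):
        regional_b.contribute(listing)

The design point is simply that the withdrawal right keeps every pool contestable: a repository that stops delivering value can lose its data to a competitor.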

4 Responses to “Tying The Pieces Together”

  1. Provocative. From an MLS Management perspective, however, I am trying to get my arms around how your ideas will resolve the governance hurdles which so plague our development as a viable and efficient tool for real estate practitioners. Competition may result in innovation and development of the product, but sometimes it merely results in allocating resources to building competitive entities, not more effective product solutions.

  2. Hi, Judith. The difference between what I am proposing and the current state is that, for those participating, there would no longer be an exclusive hold on the data by any single MLS; rather, each MLS would have to compete on value other than data access. If an MLS were offering a sub-par product or otherwise not producing value, members would leave for an MLS that is meeting their needs. The difference between what I am proposing and the Gateway or other massive central repository is that competition would be created at the local level, and that competition would provide continual impetus for the central repository to improve as well.

  3. One more thought: The main issue I see is that, in all the regionalization and Gateway discussions, the role of the MLS is being ducked. Proponents of the regional or national gateways want to avoid the question because, in their minds, the answer is not what the local MLSs want to hear, i.e., the local MLSs should go away, as Gary Thomas says. The local MLSs are going along with the regional gateway proposals to buy some time, hoping this doesn’t mean that they’ll go away. Yet what should be occurring is a serious discussion about the future that will best produce constant innovation for years to come, instead of simply falling prey to the easy idea of a central repository that inevitably will calcify and harm the very people it is designed to serve.

  4. I am not sure I’m getting this. At the heart of the MLS is a central database built primarily of information about listed real estate properties, and collected by amateur data collection teams. (And let’s be realistic about this: the data collectors and reporters are basically salespeople, not data people; creative and aggressive people, not meticulous analysts and recorders.) So the data product is first of all flawed by definition, and secondly limited in scope to that information which is immediately helpful and productive to the salespeople. Realistically, what can be ‘bettered’ in this scenario? Collection and accumulation of data is what it is in an MLS, and I don’t know of anyone who can do much to make the data more extensive, more accurate, and more timely than currently exists.

    So when we speak of “innovation for years to come”, it seems to me that we must first address this issue of the data itself: its accuracy, extensiveness, and usefulness to the target market for this data (‘target market’ being another interesting and unresolved question). And then the final question, one which you suggest: is there really a need for a local infrastructure to monitor the data and enforce local business rules in an increasingly regional, even national and international, marketplace?