Raging Regionals

Mar 14, 2007  |  Michael Wurzer

In my last post, I outlined three developing possibilities for toppling the current structure of MLSs in the United States: (1) emergence of one of the new listing aggregators as a national listing service; (2) consolidation of the current 700 or so MLSs into state-wide or other super-regional MLSs; and (3) transformation resulting from the antitrust litigation between the NAR and the US DOJ.

All three of these possibilities are related to each other. If super-regionals are better than the current market definitions, why not just jump straight to a national MLS? Are the regionalization efforts going on right now simply a last-gasp effort by the current MLS leadership to ward off the inevitable take-over by one of the national portals? The rise of the new listing portals may be the latest incarnation of the complaints against the MLSs’ attempts to structure rules for distribution of the MLS data. As the new listing aggregators run into these rules, they complain and seek alternative strategies. At the same time, even while being attacked by the DOJ through the NAR litigation, some of the biggest franchises have decided some of the listing portals are more friend than foe. Perhaps all three of these developments are creating a “perfect storm” of circumstances that will alter the MLS model forever.

While trying to keep the relationship among these issues in mind, I want to focus on each one to discuss the pain points at its heart and whether the solutions being advocated and pursued are good solutions to those pains. As the title of this post suggests, regionalization is first, for several reasons: (1) MLS regionalization is happening right now; it is not speculative in any sense; (2) the pros and cons of regionalization likely also apply to a national MLS; and (3) the current formation of regionals provides a good background for defining the relevant markets and the influence of MLSs in those markets.

There are several root causes (or pain points) driving the regionalization efforts underway:

  • Brokerage Consolidation – Brokerages are getting bigger and bigger, which means they are covering more and more territory and having to work with many MLSs. In some markets, one brokerage might have to join as many as a dozen or two dozen MLSs. Brokerages spanning multiple MLSs have to enter each listing many times to get total exposure, which takes time and money. Single entry of listings should be the goal. That brokers and agents are doing business across many MLSs has changed the game and, with the MLSs still localized, has led to frustration with the multitude of fees, rules, and data structures of the different MLS organizations and a call for their consolidation.
  • Who Gets the Leads? – Brokerages and franchises of every size are spending lots of money to develop web portals so they can be the first point of contact for the consumer. They want traffic to come to their sites and they need to aggregate listing data to do that. Aggregating listing data across a variety of MLSs is complicated today, because of the lack of standards across the many local MLSs.
  • Web 2.0 Portals – Google, Zillow, Trulia and other new entrants to the listing portal arena raise the possibility of a national system, causing progressive MLS leaders to try to get out in front of that momentum. There is real pain here, as consumers can often see listings on these portals from a variety of MLSs, while an agent who belongs to only one MLS is limited to it. This puts the agent in the awkward position of having to use a public portal instead of the MLS just to be on an even playing field with their customers.
    There is little question that there is real pain, unnecessary expense and inefficiency in the current structure of MLSs in some parts of the country. Larger urban areas are the most obvious examples. When there are no physical boundaries or borders, such that cities run one into another, the boundaries of current MLSs become an irrational legacy. Listing aggregation should be easier and less expensive than it is. Data standards should be deeper than they are. MLS rules should be more consistent in substance and application. These appear to be givens.

    What is the solution to these pains? What is the best size for a regional? Who runs it? Who sets the rules? Important to us here at FBS, who provides the software? Many MLSs are grappling with these questions right now and using different approaches to consolidation:

  • Merger – In this model, MLSs merge together so that only one entity and one system remains. This approach has the advantage of creating a common data set, common rules, single entry, and single data feeds. Until, of course, the brokerages expand further and cross into areas that haven’t yet been merged. The “right size” question is never-ending. Is the merger activity ongoing right now simply headed to a national MLS? There’s a lot of time and money being spent on mergers, but is that just forestalling the inevitable progress to a national system? Ultimately, market forces will determine the boundaries of the MLS, and the question is whether the market is moving faster than the MLSs or the MLS providers.
  • Overlay System – In this model, the localized MLSs remain intact and each provides a data feed to a separate, typically read-only, system for searching that overlays the existing systems. This approach erases some of the boundaries, but these overlay systems typically lack the depth of data, particularly for searching, that makes the local systems so powerful. Also, with the local systems persisting, the natural inclination for users is to stick with the local system and not use the overlay. This frustrates the goal of single entry, as agents soon learn they still have to enter the listing into the local system to get full exposure. The main advantage of this approach is as a band-aid or a foot in the door to ultimately moving everyone to a single system. It’s a baby step towards merger or acquisition. The question is whether this baby step is worth the cost and energy.
  • Mutual Data Exchange – This approach involves each MLS exchanging data (hopefully using RETS, but not always) so that each local system has all the data from every MLS. This approach has the advantage of each MLS retaining its independence while also providing the advantages of single entry, single data feeds, consistent (local) rules, and reduced fees. The disadvantage is that each vendor has to deal with a feed from each MLS in the data exchange, which is complicated, time-consuming and costly, given the lack of standards (a sketch of the field-mapping work this implies appears below).
  • Data Repository/Independent Front-Ends – In this model, the various MLSs provide a data feed to a “repository” database, or the data is entered directly into the repository, which can then be used for common distribution of data feeds and searching, but the local MLSs retain choice in the “front end” (user interface) they want to offer their members for searching and client management. This is a more sophisticated version of the merger or acquisition approach. The data is still being merged into a single database; the difference is that the local MLSs retain choice in the software they use to access the data.
    Of these approaches, which is best? The easy answer is, “it depends.” It depends on the local market and the market demands. In this fast-moving environment, however, “it depends” is not good enough. There may not be time to allow market forces to provide guidance. In Founders at Work, Charles Geschke of Adobe analogized Adobe’s success to duck hunting, where you need to shoot slightly ahead of the bird (the market) if you want to hit it. Perhaps the MLS bird has already flown, but I don’t think so. I still believe we can solve the pains of the brokers, agents and consumers, and I believe the answer is in standards.
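
    To make the mapping burden behind these approaches concrete, here is a minimal sketch in Python. The MLS names and field names are invented for illustration; the point is the multiplication: every MLS added to a data exchange or repository means another hand-built map for someone to maintain.

        # Rough sketch of the per-MLS mapping work; all MLS names and field
        # names below are hypothetical.
        FIELD_MAPS = {
            "metro_mls":  {"LP": "list_price", "BR": "bedrooms", "BTH_FULL": "full_baths"},
            "lakes_mls":  {"ListPrice": "list_price", "Beds": "bedrooms", "FullBaths": "full_baths"},
            "valley_mls": {"L_AskingPrice": "list_price", "L_Bedrooms": "bedrooms", "LM_Int1_1": "full_baths"},
        }

        def normalize(mls_id, raw_listing):
            """Translate one MLS's local field names into the repository's schema."""
            field_map = FIELD_MAPS[mls_id]
            record = {"source_mls": mls_id}
            for local_name, value in raw_listing.items():
                common_name = field_map.get(local_name)
                if common_name:                  # unmapped local fields are silently dropped,
                    record[common_name] = value  # the "watered down" risk noted below
            return record

        # The same house, three different shapes on the wire, one repository record each.
        repository = [
            normalize("metro_mls",  {"LP": 315000, "BR": 4, "BTH_FULL": 2}),
            normalize("lakes_mls",  {"ListPrice": 315000, "Beds": 4, "FullBaths": 2}),
            normalize("valley_mls", {"L_AskingPrice": 315000, "L_Bedrooms": 4, "LM_Int1_1": 2}),
        ]

    Every new participant means another map like these, which is exactly the cost a broad, deep standard payload would remove.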

    Our industry (the NAR, MLS organizations, MLS vendors, IDX vendors and others) in fact anticipated these issues long ago and has worked very hard on a solution through RETS, the Real Estate Transaction Standard. The RETS working group is now working on version 2.0, which has the potential to revolutionize the way listing data is collected. The industry has often grappled with how to explain the value of RETS to the real estate community, and now I believe the value is clear: RETS is a big part of saving the MLS from otherwise certain death.

    With broader and deeper standard listing definitions, single entry is far easier to accomplish. A listing can be entered in one system and moved more easily to a national repository or to another system. Aggregating the data also becomes dramatically easier, whether for MLS use or for display on listing portals. One of the biggest weaknesses of the merger or overlay approaches discussed above is that you end up with a watered-down system that doesn’t capture the depth of data necessary to accurately reflect the local market realities.

    We’ve installed over one hundred MLS systems and it never ceases to amaze me how different one MLS data set is from another. Often, these differences are the result of opinion or local vernacular. In other words, the data isn’t actually different, but it is described and understood by the users differently. In these cases, the data cries out for standards. In other cases, the differences in data structures are necessary to properly reflect the value drivers of the local real estate. The axiom that real estate is local has real meaning when it comes to data standards. Each community has unique data requirements that have been painstakingly created by the local MLSs, and those data structures have enormous value (due respect to the Bloodhound and others who’ve pointed out the value of the deep data in current MLS systems). The RETS standards, however, present an opportunity to maintain the robustness of this local data while also providing deep and broad listing definitions that ease the movement of data to wherever it needs to go (more on this in a few paragraphs).
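
    For a flavor of the vernacular problem, here is a tiny sketch. The local terms and standard values are invented; the underlying fact is identical, only the words differ, and a shared value list lets each MLS keep its vocabulary while the exchanged data stays comparable.

        # Hypothetical example of "same data, different vernacular": several
        # local terms for architectural style collapse to one standard value.
        STANDARD_STYLE = {
            "rambler":   "one_story",
            "ranch":     "one_story",
            "1 story":   "one_story",
            "colonial":  "two_story",
            "two story": "two_story",
        }

        def standardize_style(local_value):
            key = local_value.strip().lower()
            # Fall back to the raw value so nothing is lost where no mapping exists yet.
            return STANDARD_STYLE.get(key, key)

        assert standardize_style("Rambler") == standardize_style("Ranch") == "one_story"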

    The work on the MLS listing payloads (how a listing is defined) is ongoing right now. MLS organizations, MLS vendors, at least one listing aggregator, and others will be working to finalize the standards over the next six months. The next meeting is in April, and there is work going on right now to create what I like to call a RETSipedia that will expose the current payloads on the web in a format that lets the domain experts (brokers, agents and MLS executives) provide comments and feedback. (For those brokers, agents and MLS executives reading this: if you’re not familiar with RETS and believe in saving the MLS, keep an eye on RETS.org, because you’re going to be asked to participate and help define the standards.)

    Whenever I talk with industry veterans about creating agreement on broad and deep data standards, a common response is, “do you remember RIN?” RIN (the REALTOR® Information Network) was an attempt in the mid-’90s to get ahead of the technology movement and create a national listing standard.

    (Side note: I often hear people, including brokers and agents, talk about how slow moving the real estate industry is with regard to technology. This is hogwash. The real estate industry leaders have, for the most part, been pushing technology harder and faster than the technology is growing. The leadership often doesn’t get this credit, but it is deserved. MLS systems were invented by the real estate leadership and those systems have revolutionized the purchase and sale of real estate. Many REALTORS® are huge geeks and gadget freaks and those who paint them as dinosaurs with a broad stroke are mistaken.)

    RIN failed (well, actually, it morphed into Realtor.com) because the process of defining the listings produced tens of thousands of data elements. Literally. As input was collected from all the various MLSs, the different data elements became a huge morass. The challenges of that dialog could not be overcome at the time. The memory of those painful discussions causes industry veterans to cringe at the thought of trying to reach broad agreement. The beauty of the RETS 2 standards, though, is that standardization and customization can be approached at the same time. The X in XML stands for eXtensible. The payloads will remain extensible and can be tailored to local needs. This is a critical requirement for maintaining the value of the local data sets.
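
    To illustrate the extensibility point (this is not the actual RETS 2 payload schema; the element and namespace names below are invented), an XML payload can carry standard elements and a local extension side by side, so a consumer that only understands the standard can simply ignore the local namespace:

        # Sketch only: XML namespaces carrying a standard listing payload plus a
        # local extension.  Namespace URIs and element names are invented, not
        # the real RETS 2 definitions.
        import xml.etree.ElementTree as ET

        STD   = "urn:example:standard-listing"     # hypothetical standard namespace
        LOCAL = "urn:example:lakes-mls-extension"  # hypothetical local-MLS namespace

        listing = ET.Element("{%s}Listing" % STD)
        ET.SubElement(listing, "{%s}ListPrice" % STD).text = "315000"
        ET.SubElement(listing, "{%s}Bedrooms" % STD).text = "4"

        # A local value driver the standard does not define, carried without
        # breaking consumers that only read the standard elements.
        lakeshore = ET.SubElement(listing, "{%s}Lakeshore" % LOCAL)
        ET.SubElement(lakeshore, "{%s}FrontageFeet" % LOCAL).text = "120"

        print(ET.tostring(listing, encoding="unicode"))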

    At the same time, a common language and understanding must be developed on every element possible. Disagreeing about the definition of a bathroom is no longer acceptable. If we are not the ones to define these standards, who is? Brokers, agents, NAR leadership, MLSs, and MLS vendors have led the real estate industry for many years and we need to continue to do so and recognize that data standards are critical to our future success.

    RETS 1.x has been a success, but that success has been limited by the lack of depth in the common names. Light IDX sites can operate okay with the common names, but, more often than not, what’s necessary is getting what is known as the compact or compact-decoded format, which is really just a delimited custom data file that requires custom programming. In other words, RETS 1.x provided a common way to retrieve the data but not a standard for what the data would be once it was retrieved. RETS 2 has the potential to change that by providing broad and deep data definitions.
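
    For readers who haven’t seen it, here is roughly why the compact-decoded route still means custom programming. The sample response is trimmed and the column names are one hypothetical MLS’s local names; only the envelope is common.

        # Simplified parse of a RETS 1.x COMPACT-DECODED search response.  The
        # retrieval envelope is common, but the column names are whatever the
        # local MLS defined, so the mapping work starts after this point.
        import xml.etree.ElementTree as ET

        SAMPLE = (
            '<RETS ReplyCode="0" ReplyText="Operation Successful">\n'
            '<DELIMITER value="09"/>\n'
            '<COLUMNS>\tLIST_PRICE\tBEDS\tBATHS_FULL\t</COLUMNS>\n'
            '<DATA>\t315000\t4\t2\t</DATA>\n'
            '<DATA>\t289900\t3\t1\t</DATA>\n'
            '</RETS>'
        )

        root = ET.fromstring(SAMPLE)
        delim = chr(int(root.find("DELIMITER").get("value")))  # usually a tab
        columns = root.find("COLUMNS").text.strip(delim).split(delim)

        rows = []
        for data in root.findall("DATA"):
            values = data.text.strip(delim).split(delim)
            rows.append(dict(zip(columns, values)))  # still local names, not a standard

        print(rows[0])  # {'LIST_PRICE': '315000', 'BEDS': '4', 'BATHS_FULL': '2'}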

    Once the data is defined, the technology solutions will come much more easily. MLS vendors and other technology firms serving the real estate industry will create amazing things with standard data. Also, with nationwide data standardization driving the change, the need for painful and disruptive mergers of many MLSs may dissipate, or at least there will be an alternative strategy. Any MLSs considering the painful process of jamming multiple, disparate data sets together into a single database outside of the RETS process should think twice. Instead, participate in RETS and develop the standard with the community, so that when this wave of change passes, we have a true standard and not a continuation of the old disparities, just in larger, more inflexible packages. Big MLSs are only the solution if they come through a national standard.

    Importantly, however, standard data definitions are not the only pain here. The disparate MLS rules, aggregation policies, IDX policies, and the like need to be standardized, too. The NAR has been beaten back so many times by the local MLSs on these various issues that it cringes at the thought of trying to “mandate” a policy. Now, with the DOJ jumping down its throat, the NAR is in no position to mandate anything. But standards are needed on MLS policies, too. Perhaps this is where a “state-wide” effort makes sense. On the other hand, for data distribution policies (IDX, ILD, etc.), national standards are critical. Slowly but surely, we’re now seeing sold information on broker portals. When will this be a national standard? With the NAR sidelined by the DOJ litigation, the brokers need to step up to the plate and create agreement on these issues. The success of the MLS depends on it.

    Of course, the issue of data distribution policies, while definitely related to the issue of regionalization, is a topic unto itself, involving both the Web 2.0 challengers and the DOJ litigation. So, until next time . . .

    21 Responses to “Raging Regionals”

    1. westside says:

      RETS 2? Why didn’t we get a working standard that met the needs of the industry first? Simple listing retrieval? That’s all I ask for.

      Instead, they rolled out a more complex system to integrate many industries, many of which do not even know that they are supposed to play along. It’s kind of nonsense.

      It took them 8 years to half work out a standard, and now they are trying to change it? Trulia did the same thing, better, in less than 4 months.

      Concepts will pass them by before the first server implementation is complete 2 years from now. Time to RE-evaluate.

    2. Trulia has a great system, but an MLS system it is not. The data simply isn’t deep enough. Standards work is hard. I agree that RETS 1.x has limitations, but I believe the industry can still lead. Actions speak louder than words. We’re acting by advocating and helping to define the standards.

    3. westside says:

      Ahhhhh. But that is the key here. It’s not about replacing the local MLS, and not really even so much about consolidation! It’s about data exchange.

      Certain data should stay within and be regulated by the MLS. I really wouldn’t want it publicly known from my listing data that the property will be vacant for the next 3 months.

      So yes, Trulia is not an MLS. Nor should any of that data be transferred out of the MLS.

      Getting data to different end points without being restrictive or overly complex: that seemed to me to be the focus of this article. People tend to blur the lines, and that is where the “MLS is doomed” debate begins.

    4. westside says:

      In other words, it’s about ‘publicly viewable listing data exchange’.

      Trulia, GoogleBase, PropSmart, etc. all do it well: quick, efficient and versatile.

      RETS 1.x, as you have pointed out, has its limitations, and RETS 2.x has jumped the shark and gone outside the realm of listings.

      I don’t get it, but that’s what the RETS site says:

      academia, banks and title?

      Last I checked, the finance industry has its own rules to play by.

      Shouldn’t RETS’ main focus be on listings, and on NAR helping its members (which, last I checked, were REALTORS?), and getting that done correctly?

    5. Robbie says:

      Zillow, Trulia, etc. are great, but they are designed for the needs of consumers. They also don’t have the burden of backward compatibility, they don’t have existing MLS boards or data formats to deal with, and they have the luxury of playing with venture capitalist money.

      The MLS system has its shortcomings, but it is designed for the needs of professionals and it mostly works. I think the industry just needs to wake up and smell the software, and invest more in technology (so their vendors can improve things).

      I think the biggest problem is getting the RETS standard deployed. I think MLS vendors want to support it, IDX vendors need it, and brokers, agents, and consumers would all benefit from the lower cost of software development that it brings.

      The problem is that a standard that isn’t widely deployed isn’t much better than not having standards at all (or having multiple standards).

    6. Robbie says:

      I’m not sure if a national MLS is the answer, but standards for data definition and MLS policies definitely are. As a small IDX vendor, the challenge of keeping the entire USA’s MLS data w/ photos on my server would be like sipping from the firehose.

    7. Westside: First, as a new blogger, I really appreciate your comments. Second, yes, I agree, it is about data exchange. Data standards and standards for distributing that data are about improving data exchange, both privately (between MLSs for brokers and agents) and publicly (to consumers). My post on regionalization was focused on the private side, mostly. As you say, the two (private and public) are starting to blend, which complicates matters, but, in some areas, there are very real pains being experienced by the brokers on the private side (duplicate listing entry, duplicate fees, conflicting or at least contrasting rules, etc.) that are driving regionalization separate from the consumer-side issues. I believe the RETS 2 payloads, if made broad and deep, will help address many of these pains by making the data easier to exchange. The issue of standards for distributing listing data publicly is a topic I reserve for a planned future post.

      Regarding the broad scope of RETS, the community is broad. There are lots of participants, including those from the transaction management industry, MISMO (the mortgage industry) and others. I don’t believe these efforts have distracted those of us in the community who are focused on listing payloads. And, to the extent the payloads haven’t gotten enough attention, that’s one of the core reasons I’m posting: to focus attention. That being said, there has been so much really excellent work by so many people on RETS 2. Paul Stusiak, Paula O’Brien, Jeff Brush, Mark Lesswing and so many others have gotten the payloads in really great shape already, and now we just need a final push from the outside in to help ensure we get them to a point of wide adoption. We need buy-in from the broader community as these come to fruition. That’s what I’m promoting.

      Robbie: Thank you for posting. I agree completely with everything you said, especially “The problem is that a standard that isn’t widely deployed isn’t much better than not having standards at all (or having multiple standards).”

    8. Chaz says:

      Everyone brings up great points.

      A little quip: ‘Actions speak louder than words. We’re acting by advocating’

      Isn’t advocating just the perceived act of verbally conveying? Of course, you could ‘lead by example’, but let’s take a look at that.

      NAR doesn’t even have a means in place to correlate/aggregate the data and provide good analysis of the market themselves. I think at most they tap into a half dozen MLSs in major markets to get their ‘perceived’ feeling of the state of affairs.

      Should they not have a valid reason to communicate with and receive the data from all the MLSs? They could then distribute it. In essence, they would become the standard method of data normalization, as opposed to forcing it across 700+ MLSs.

      I kind of agree with @Westside. Why is the focus not more on listings? I read that Charter, and I do not see why some of those sectors (Academia?) are being considered viable end-use scenarios for RETS.

    9. Chaz says:

      I would like to point the readers here:

      It’s a RETS conversation and it clearly defines the issues at hand. The community sees easier alternatives, and does not understand why the RETS standard is so clunky and tough to implement.

      Much of the technology of RETS does not seem to have justification for its use, as opposed to some ‘experts’ deciding that is the correct way. Unfortunately, although I’m not sure, I don’t think any of the committee members have web-services development experience as opposed to just having had ‘industry’ connections.

      Adoption would be much greater if they started simple and efficient and took baby steps forward rather than jumping in the deep end.

      I see more broker blogs talking about getting their data out to GoogleBase using that API as opposed to turning up RETS.

    10. Chaz, point well taken regarding my use of the term advocating. The issue of Google and RETS is inapposite here. The issue at hand is the exchange of data between and among MLS systems. As others have pointed out, Google has an API for pushing a very limited data set to them. That’s the only way to get data to Google, so it’s a non-issue. From a comparison perspective, however, RETS may appear more complicated because it does much, much more than push data. More importantly, however, those are issues regarding transport. My main concern here is, as mentioned in your first comment, the definition of the listings. I’ll leave to others, much more capable than I, discussion of the transport issues. But I’m fully on board with you that focus on getting deep and broad listing definitions is very important. I think this is mostly important for exchange of data among MLS systems, and I’ll address the issue of public distribution (the real estate advertising market) in a future post.

    11. Jeff Brush says:

      A standard is easy if there is only one implementation. Try to get three separate groups to implement GoogleBase on their own systems and you will have the same problems you see in RETS 1.x. Probably worse.

      The RETS2 specification separates the transport from the data. This creates a more open standard, allowing other audiences to add their own payloads. Mortgage (MISMO) payloads might be transferred, for example. This does not mean RETS is spending time developing other sectors’ payloads, just that RETS2 can carry them.

      The big challenge for RETS is reaching agreement on fields and payloads. A root canal is less painful. Come to the April Meeting in Austin, TX and you can feel my pain.

      Once the data is defined, wrapping a lightweight transport around it can be straightforward. I’ve been looking at Google’s GData API. Just pushing RETS payloads into it should be easy. Getting an MLS to agree to adopt it, less so. If there is an MLS with an interest, “Let’s talk!”

      Using the RETS2 WSDL and modern SOAP toolkits, client developers have built a simple RETS2 client in less than a day. SOAP is used because off-the-shelf toolkits will hopefully support the diverse security requirements that many vocal MLSs require.

      I see the divided views like eCommerce. There are business-to-consumer (b2c) and business-to-business (b2b) models. For the MLSs, the b2b-type solution is the one they focus on.

    12. Matt Cohen says:


      You’re right on that standards are key to any of the four approaches you outlined – merger, overlay, mutual data exchange, and data repository / independent front-ends.

      That still leaves the question of approach. I’ve helped facilitate several approaches and, again, you’re on target with the pros and cons you listed. To take it a bit further, though, my understanding is that today’s brokers – at least those large enough or geographically positioned in such a way as to cross MLS boundaries – want standardized rules, consistent rules enforcement, a single standard data feed, and generally fewer MLSs to deal with.

      The data exchanges and such don’t really end up addressing the rule-oriented problems. Which MLS’s language will you put at the bottom of a listing on your public web site – all of them, when a property is cross-listed? I’m sure you can think of other fun rule and compliance examples, as well as other difficulties when you’re trying to cross-post into systems with different business/data rules.

      Regarding the data repository with independent front-end approach, that works in the short term if all the MLSs involved can pull all the data into their MLS systems, but imagine you’re a very small MLS next to some larger ones and they go this route – you could end up with a very small number of members trying to pay for an MLS front-end that would manage access to a very large number of listings, which may not be feasible. This can lead to the smaller MLSs not being able to provide their subscribers the benefit of the arrangement and result in friction and other problems for the small MLS.

      It’s too early to say that any of the approaches are not viable – but there’s a lot to consider when it comes to choosing a path, especially trying to balance out showing short term progress and achieving a longer-term vision.

    13. Paul says:

      To Westside and Chaz:

      I’m not sure where in Mike’s blog you saw the reference to RETS2, but let me try to answer your questions. I understand your desire to “just get the listings” but it isn’t quite that simple.

      A small set of the listing information is most valuable when it is used for public marketing of the property, similar to the types of information seen on IDX websites. Much of the remainder of the information – and for many MLS systems, it is around 90% – is governed by the business rules of the MLS, the broker or state laws.

      So, given that there is only a small amount of data that is “public,” how is the remainder handled? RETS1 covers only part of this and is focused on the listing inventory. RETS2 is attempting to spread this to a broader group.

      Regarding the rets.org website, I’m not sure who wrote the copy on the RETS2 vision – academia definitely doesn’t belong in there. As well, the vision statement and rationale are not exactly right. I’ll try to revise them in the next several days.

      As to the other groups on the list, banking and title, RETS is attempting to keep what the REALTOR does relevant to the whole transaction, not just the internet marketing portion. There are several things at work here.

      First, we did not get as broad an adoption of the concept of data standardization as desired within the listing inventory data in RETS1. This is not the first time that such an attempt has been made. Several reasons can be offered for this. The most important is that the purchasers of the systems did not see sufficient value to pay for such a mapping exercise. Many vendors were interested in doing such work, but it is not without cost, sometimes substantial. The owners of the data need to see a compelling reason to make such an investment.

      Second, the RETS1 standard was not well suited to participating in a broader computer-driven real estate transaction. Representation, security and tools were not good enough to get deeply involved in the bigger picture. Poor data and weak security are not acceptable to many players in the whole transaction.

      Thirdly, there are some additional functions that are having automation applied. Things like “transaction management” and back office solutions are part of this trend. Coupled with this is a level of consolidation where some of the MLS system vendors are buying or being bought. These tend to stay within the bigger real estate picture: as opposed to a vendor buying a waste management software company, they are becoming part of a larger entity that may offer transaction management, title or many other pieces of the real estate transaction.

      These factors and others were considered, as well as the state of RETS1, before embarking on the new standard. It is my belief that RETS1 systems will continue to deliver value for many years to come. However, its limitations and baggage make some of the possible types of interactions more difficult.

      Regardless of regionalization pressures, having a standard way to securely transmit information between systems – whether they are MLS systems, transaction systems, or MLS and title systems – is very valuable.

      Combining the three points above with possible regionalization may provide sufficient value for the MLSs and the MLS software vendors to invest in RETS2.

      I hope this gives you a little context around your question of RETS2.

    14. Ashlonbay says:

      Michael, this comment is very delayed, but I do have some questions and comments I would like to add. In this article you have pointed out the ‘pains,’ as you call them, of the MLS industry. But has anybody stopped to ask themselves why the MLS was created? By whom, and for whose ultimate benefit? I can answer each one of those questions in 2 words – INDUSTRY LEADERS! Here is your quote: “MLS systems were invented by the real estate leadership and those systems have revolutionized the purchase and sale of real estate.” They have also created complete separation within the industry, where both the consumer and the industry suffer. The MLS was simply created for the wrong reasons, which was to create dependency upon the professional in order to market and sell a property through 1000’s of money-hungry agents trying to outsmart one another. I fully support your notion to create a national database and regionalize that information, but it has to be created with one reason in mind, and that is the buyer’s and seller’s benefit.
      Michael, I’m actually very passionate about this topic and have a lot more I’d like to say (off-blog), so if you have a chance to contact me, I’d be grateful to talk.

    15. Ashlonbay, I’d be happy to speak with you off-blog, but you left no indication of who you are. The pains I was referencing in my post were the pains the brokers are experiencing from the lack of data and rule standards. We have different views of agents, certainly, as those I know (and I know a lot of them) are very hard working people who want to do well by their clients. Just as in any industry, there are agents who do less well for their clients than others, but that’s the nature of competition. The cream does rise to the top, and, in the end, as you say, the buyers and sellers determine who survives by their purchasing decisions.

    16. Ashlonbay says:

      Email me back and I will give you a phone number to call.

      Thank you,

    17. […] written previously about the MLS regionalization efforts going on in California and elsewhere, and I wonder how connected those efforts are with NAR’s […]

    18. […] current regionalization efforts should be harmonized with the NAR’s Gateway by focusing on RETS and creating a national […]

    19. […] discussions across the country about consolidating MLSs rages on.   In some areas, however, the focus appears to be more on who leads the consolidation than on […]

    20. […] perfect storm.  I described the perfect storm early on in the FBS Blog as having three fronts: (1) brokerage consolidation pushing MLS consolidation; (2) the web 2.0 movement, both inside and outside of real estate, is engaging consumers like never […]

    21. […] MLSs are the best solution.  Instead, as I’ve long advocated here on the FBS Blog, the best solution is a national data standard that allows for innovation and competition at every level by every MLS, franchise, broker, agent, […]