Transforming Information Companies

Today’s surprise announcement that the Washington Post Company is selling its flagship newspaper along with several other newspapers to Jeff Bezos is only part of a wider, ongoing story of transformation.  Earlier this year, the company took a sharp turn away from publishing and media with the purchase of Celtic Healthcare, a provider of home healthcare and hospice services, and last month it bought Forney Corporation, a supplier of systems for power and industrial boilers.  These moves follow the sale of Newsweek in 2010, and subsequent divestiture of some smaller publishing properties.  The company still owns Slate, The Root, and Foreign Policy magazine as well as Kaplan, the for-profit education company, but it has clearly decided that its future lies outside of publishing and media.

Other companies have made equally dramatic transformative moves through recent and not-so-recent M&A activity:

  • McGraw-Hill divested its education division in March as part of an effort to re-shape itself around financial information (or so the company’s new name says).
  • Thomson was years ahead of the market in divesting its large newspaper business in the 1990s — a step that laid the foundation for giant moves into financial information and then legal information businesses.  Along the way, Thomson divested its educational publishing business in 2007 and healthcare information business in 2012.
  • LexisNexis formed its risk analysis division by purchasing Seisint in 2004, a move that was bolstered by the much larger acquisition of ChoicePoint in 2008.
  • Google paid a whopping $1.65 billion for YouTube in 2006, anticipating the importance of video for both consumer and business applications.

Transformational transactions such as these are characterized by their focus on optimizing a company’s whole portfolio by moving into or out of entire businesses.  That distinguishes these transactions from the more routine M&A transactions that companies large and small undertake for a variety of reasons, such as rolling up competitors, adding complementary products and capabilities to existing product lines, and expanding into new markets.  From our experience in helping companies transform themselves, we observe that successful transformations share certain characteristics:

A transformational culture:  Companies that integrate M&A into their ongoing strategic planning are more likely to succeed in transforming themselves than those that think of M&A primarily as an opportunistic activity.  That means that operating executives, not just a company’s professional strategists, need to think about M&A as much as ongoing operations.

Rigorous analysis:  The foundation of any transformation is portfolio analysis, a systematic review of a company’s business units to assess their future growth prospects, resource requirements, and likely returns.  Portfolio analysis must consider such factors as underlying market growth, competition, and the impact of new technologies.  Since operating executives typically have the keenest understanding of their markets, the fusion of their knowledge with the analytical skills of professional strategists often generates critical insights and sharper decision-making.

Nimbleness:  Successful transformations often result when companies are willing to act early, even paying a premium for an asset before it can be fully justified by financials or divesting before a business shows signs of distress.  Thomson was able to realize an attractive price for its newspaper businesses because it recognized the need to divest well before anyone else, and Google was willing to pay a large multiple for YouTube to secure an important new asset.

Of course, not all M&A-driven transformations work out well.  In the 1990s, Knight Ridder made the wrong bet by divesting its information businesses to focus on newspapers, spending over $1 billion on acquisitions before ultimately selling the entire company in 2006.


Relationship Science: LinkedIn for the One Percent?

Relationship Science, a new company that bills itself as “the ultimate business development tool,” has been in the news because it recently raised $90 million from a set of marquee investors and, perhaps even more significantly, it is the follow-on act for Neal Goldman, whose first company, Capital IQ, was bought by McGraw-Hill for $200 million in 2004.  RelSci (as the company nicknames itself) helps you find relevant business contacts through people you know.  At first glance, it looks like LinkedIn, but there are some big differences:

  • Focus on the upper crust:  Like LinkedIn, RelSci allows you to see how a target contact or institution may be linked to people you know.  But in contrast to LinkedIn’s 225 million members, RelSci’s database currently contains information on only 2.2 million people that it deems “influential” based on such factors as their executive levels and board memberships in corporations as well as non-profits.
  • A reference database, not a social network:  RelSci is a highly-structured source of relationship information, but offers no social networking.  Unlike LinkedIn, there is no facility for linking or communicating with potential contacts or asking existing contacts to provide introductions to the people they know.  RelSci claims that 40% of the people in its database are not LinkedIn users anyway and therefore are not reachable through online social networks.  There is, however, one social-network-like aspect:  As part of the set-up process, users can upload their contacts from Outlook, LinkedIn, or other sources.  These relationships then get added to the user’s virtual database, enabling the system to include those contacts privately in the pathways that it displays for reaching targeted contacts.
  • Curated content: In an age of user-contributed content, RelSci built and maintains its database the old-fashioned way, with hundreds of data analysts (mostly in India) using publicly-available sources from the web, company documents, and news articles as well as from commercial databases, including FactSet, LexisNexis, Morningstar, GuideStar, and Noza. (The latter two databases provide information on non-profit organizations, boards, and contributors.)  Not surprisingly, this is the same approach used by Capital IQ in its database of relationships for the institutional investment community.
  • Subscription business model:  LinkedIn uses a freemium model, providing a free service for everyone as well as premium tiers for people who need extra information or functionality.  Advertising is also an important revenue source.  By contrast, RelSci operates entirely on a subscription model which starts at $9,000 annually for a minimum three-seat license.
  • No approval required for links:  LinkedIn’s relationships rely on users agreeing to link to each other, whereas RelSci assigns connections based on known (or implied) relationships from its database.  This more restrictive approach means that RelSci has fewer connections overall and lacks the serendipitous connections that occur on LinkedIn (e.g., friends and neighbors).  At the same time, RelSci’s approach mitigates the noise in social networks from the high number of weak connections that results from “promiscuous connecting.”  It also means that users do not have to do anything in order to start using the product.
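The pathway display described above is, at heart, a shortest-path search over a relationship graph. The sketch below is a minimal illustration using an invented edge list (including a user's privately uploaded contacts as extra edges); it is not RelSci's actual algorithm:

```python
from collections import deque

def find_pathway(edges, start, target):
    """Return the shortest chain of people linking start to target,
    using breadth-first search over an undirected relationship graph."""
    graph = {}
    for a, b in edges:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)

    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for neighbor in graph.get(path[-1], ()):
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(path + [neighbor])
    return None  # no known chain of relationships

# Curated relationships plus a user's own uploaded contacts, merged as edges.
# (All names here are invented for illustration.)
edges = [
    ("you", "alice"), ("alice", "board_member"),
    ("you", "bob"), ("bob", "carol"), ("carol", "board_member"),
]
print(find_pathway(edges, "you", "board_member"))
# → ['you', 'alice', 'board_member']
```

Because breadth-first search explores contacts one hop at a time, it always surfaces the shortest available introduction chain first, which is the behavior a business-development user would want.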

In all likelihood, curated products like RelSci will co-exist with social networks like LinkedIn.  Meanwhile, RelSci is off to a fast start.  The company is targeting law firms, accounting firms, consulting firms, and others that do high-level prospecting, and already claims to have 1,000 users from 175 companies since its launch at the end of February.


Firestorm Over Elsevier Acquisition of Mendeley

Elsevier’s long-rumored acquisition of Mendeley was finally announced a week ago, touching off a reaction that demonstrates the ongoing tension between large academic publishers and their users as well as the pitfalls of open, networked communities.  For Elsevier, Mendeley is a strategic gem and well worth the $70 million unofficially-reported price.  Mendeley was launched in 2009 by a group of PhD students as an open, cloud-based system for academic researchers to download, annotate, and manage the articles and papers they use for their research and also to manage their references – the articles and papers they cite in their own published works.  Reference management is a fundamental part of the research process and had been served for many years by software products, such as EndNote and Reference Manager, owned by Thomson Reuters.  Mendeley has caught on quickly because its cloud-based architecture combines reference management with social networking. For example, users can share their lists of citations, search for potential collaborators, share their own professional information and papers, and develop applications that tap into the content users store in the system – features that scholars have found very compelling.

Another appeal of Mendeley is that it was designed and run by scholars and wasn’t beholden to any publishers or large corporate interests.  And that’s where the firestorm starts.  Since the announced acquisition, many Mendeley users have spoken out in harsh terms about both Elsevier and Mendeley.  Using terms like “evil empire,” “repulsive,” and “sellout,” many users have taken to blogs and Twitter to denounce the acquisition, and many, including some high-profile academics, have said that they will no longer use Mendeley.  For some critics it’s a matter of principle; others say they are also concerned about trusting their data (i.e., their personal lists of most-important articles) to Elsevier.

Critics may find it more difficult to leave than their protests suggest.  Mendeley already has over 2 million users in 180 countries, and the more those users take advantage of the platform’s open, social aspects, the harder it becomes for them to exile themselves.  Furthermore, Mendeley, which has been built on a freemium model, has been moving from individual memberships to institutional memberships that cover entire universities or at least whole departments.  While it is still early in these efforts, institutional memberships make it harder for individual scholars to opt out when their colleagues are all using a common platform.  Elsevier brings an extensive worldwide institutional salesforce that can help accelerate institutional adoption.  Finally, separatists have few alternatives.  Mendeley’s competitors, such as ResearchGate and Zotero, offer similar functionality, but they have much smaller communities of users and lack Mendeley’s market momentum.  It’s not unprecedented for new online services to overcome the head start of established players (witness Google vs. incumbent search engines), but services that have successfully exploited network effects are much more difficult to overtake.


Unlocking New Answers from Content

The annual NFAIS conference is a relatively unknown gem of an event for those of us interested in the future of the information business in the business, professional, and academic markets.  (Full disclosure: I was on the planning committee for this year’s event.)  NFAIS, which actually dates back to the Eisenhower Administration, is still a gathering place for vendors, information scientists, and librarians from major university and government libraries from around the world.  This year’s conference provided solid evidence that the information industry is at an inflection point in being able to unlock new answers from content.  While this may sound like a slogan, it’s revolutionary and real.  The most dramatic examples come from Big Data, applying computational techniques to large sets of content to find relationships and patterns among entities, people, and events.  Google is well-known for using such techniques to continuously improve the relevance of its search, but now these computational approaches are being incorporated into scientific, business, and professional information services.  At NFAIS, for example, Thomson Reuters showed how it has applied such techniques to improving search in WestlawNext, its research platform for lawyers, and in Eikon, its platform for financial professionals.  (One of the key people behind these breakthroughs is actually a computational biologist.)  Another presenting company, Narrative Science, showed how its software can automatically write news stories from quantitative data, such as baseball box scores and company earnings reports.
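At its simplest, the data-to-text idea behind Narrative Science can be thought of as template-driven generation: structured numbers select both the facts and the wording of a sentence. The following toy sketch uses invented team data and an invented word-choice rule; it is in no way the company's actual software:

```python
# Toy data-to-text generation: render a one-sentence game story from a
# quantitative box score. The teams, scores, and the 'edged'/'routed'
# threshold below are all hypothetical.

def recap(game):
    """Turn a structured box score into a short narrative sentence."""
    if game["home_runs"] > game["away_runs"]:
        winner, loser = game["home"], game["away"]
        high, low = game["home_runs"], game["away_runs"]
    else:
        winner, loser = game["away"], game["home"]
        high, low = game["away_runs"], game["home_runs"]
    # The data drives the word choice, not just the numbers in the sentence.
    verb = "edged" if high - low <= 2 else "routed"
    return f"{winner} {verb} {loser} {high}-{low}."

print(recap({"home": "Cubs", "away": "Mets", "home_runs": 9, "away_runs": 2}))
# → Cubs routed Mets 9-2.
```

Production systems layer many such rules and richer language models on top, but the core move is the same: mechanical selection of narrative from structured data.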

This revolution is powered by a variety of tools and techniques, such as data mining, text mining, and visualization, to derive richer metadata – and doing it across larger, more diverse content sets.  These advances in turn drive new uses, as illustrated by some other presenters at NFAIS.  Elsevier’s ClinicalKey, for example, unifies Elsevier’s medical content (i.e., books, journals, practice guidelines, patient education, and drug information) under a single taxonomy that relates previously-unrelated elements of content, thereby enabling users to discover relevant information.  Another example is McGraw-Hill Construction, which has applied enhanced tagging to its database of construction reports, traditionally used to help contractors and other players identify potential projects.  With its enhanced tagging, McGraw-Hill has been able to launch new analytical products that use the existing content for new purposes, such as analyzing market trends, identifying the most active developers, and understanding the market shares of construction component manufacturers.
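The unifying-taxonomy idea can be illustrated in miniature: once heterogeneous content items carry tags from one controlled vocabulary, a single concept query spans content types. The items and tags below are invented stand-ins, not Elsevier's or McGraw-Hill's actual schema:

```python
# Heterogeneous content unified under one taxonomy: each item, whatever its
# type, carries tags from a shared vocabulary. (Hypothetical catalog.)

catalog = [
    {"type": "journal_article", "title": "Beta blockers in heart failure",
     "tags": {"cardiology", "beta_blockers"}},
    {"type": "drug_monograph", "title": "Metoprolol",
     "tags": {"beta_blockers"}},
    {"type": "patient_handout", "title": "Living with heart failure",
     "tags": {"cardiology"}},
]

def related(tag):
    """Return the titles of all items carrying a taxonomy term,
    regardless of content type."""
    return [item["title"] for item in catalog if tag in item["tags"]]

print(related("beta_blockers"))
# → ['Beta blockers in heart failure', 'Metoprolol']
```

The same tagging also supports the analytical re-use described above: once reports are tagged consistently, aggregate queries (market trends, most active developers) fall out of the same structure.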

The fact that these advances increasingly fuse structured data and unstructured content distinguishes the current revolution from previous advances, which tended to focus on improving one type of content or the other, but rarely both.  Furthermore, these advances offer vendors external benefits, such as improved information discovery and other user experience improvements, as well as internal benefits, such as better intelligence about how customers actually work and which product enhancements are most worthwhile.

As in many technology revolutions, this one is happening because of the combination of off-the-shelf tools as well as the mainstreaming of knowledge about the techniques for unlocking new value from content.  This revolution is becoming the new normal.


Sign of the Times: Thomson Sells Its Law School Publishing Business

In the mid-1990s, Thomson made the prescient decision to divest its sizable newspaper business, which was one of the foundations of its information business.  Thomson again seems to have made the right call with its recent divestiture of its law school publishing business.  Capturing the hearts and minds of law school students was long viewed by Thomson and rival LexisNexis as important in future sales to those same people after they became practicing lawyers.  But times are changing.  Law school applications in the US have fallen nearly 40 percent since 2010, according to the American Bar Association.  At the same time, the legal profession is re-examining its educational system and considering some of the biggest changes in a century, such as reducing in-school education and creating new para-professional roles to handle some legal procedures.  (This move is similar to healthcare’s growing use of nurse practitioners to handle some medical care that used to be the sole responsibility of physicians.)  These changes are likely to result in fewer students for case books, study guides, and other products.

Thomson will still have its WestlawNext online research platform to offer law schools, but Thomson’s move reflects changes in the way that law firms purchase information services, as decision-making shifts over time from individual lawyers to professional managers responsible for running firms as businesses.  These professional managers, including chief financial officers, chief administrative officers, chief operating officers, and librarians, are typically willing to trade off individual lawyers’ preferences for less expensive products (or deep discounts from vendors).  This trend toward professional management has been accelerated by cost-cutting pressure on law firms, which started with the 2007 recession.

Another trend associated with professional management has been the growing importance of systems to help firms manage key business functions, such as client development, case management, and billing.   Through acquisitions, Thomson Legal and LexisNexis have become the largest vendors of these business management systems, thereby reducing the relative importance of their traditional online research services and the influence of individual lawyers who used to select such services.  All of this brings us back to where we started:  that law students’ loyalty has diminishing value in winning future law firm business.


Mendeley and the Power of Open Communities

Recent reports about Mendeley show the extent to which its open approach for sharing scholarly research seems to be catching on. The five-year-old company, part citation manager, part collaborative research platform, and part social network, now claims to have nearly 2 million users. Mendeley’s origin, and still the heart of its value, is in a desktop tool that helps scholars capture and manage bibliographic citations, a fundamental and tedious element of scholarly research.

It was a latecomer to this market, lagging behind two popular products, EndNote and RefWorks, but it offered a twist – it was built explicitly around the idea of collaboration with other users. The most direct evidence of this is that Mendeley requires users to store their citation information on its servers, which has paved the way for a range of collaborative services. Mendeley’s citation manager has proven to be a Trojan horse, as users have gone on to upload nearly 65 million scientific papers and to participate in a variety of Mendeley’s collaborative services, including sharing their professional profiles, identifying research partners, sharing citations with other researchers, and tracking the readership of posted papers. One of the latest signs of the power of Mendeley’s openness is the growth of third-party apps that tap into Mendeley’s database and perform analyses of the articles and bibliographic citations it contains. The company claims that it now has 240 such applications, whereas Elsevier, the largest commercial publisher, which thus far has controlled access by third-party apps, has around 100 such apps, according to a recent TechCrunch article. Mendeley gets over 100 million data queries per month through such apps, according to its CEO, Victor Henning, in TechCrunch. As science continues to become more inter-disciplinary and collaborative, tools like Mendeley serve as both evidence of and potential drivers of the need for increasing collaboration.

The growth of apps may be one of the most important developments for Mendeley as well as for the field of scholarly research. While social networks for professional users have tended to become more specialized over time, no organization, commercial or nonprofit, can keep up with user demand for specialized applications without opening up its system so that users and interested third parties can do independent application development. Though still young, Mendeley appears to be on the right road.


This Database Tells a Story

At their best, databases aren’t just static repositories of information. They are tools that can help tell stories. One of the newest databases, the National Registry of Exonerations, shows just how eye-opening a story data can tell when it is collected and normalized. A joint project of the University of Michigan Law School and the Center on Wrongful Convictions at Northwestern University Law School, the database profiles approximately 900 of the 2,000 cases in which defendants were convicted and then exonerated in the US since 1989, the year that DNA was first used as evidence. (Another 1,100 “group exonerations” that occurred in response to 13 separate police corruption scandals are not included in the registry.)

A report produced from the database by University of Michigan law professor Samuel Gross reveals some of the shocking mistakes that occur in our criminal justice system: Wrongfully-convicted prisoners captured in the database spent an average of 11 years in prison before being exonerated. Perjury and false accusations were the most common causes of a wrongful conviction, accounting for 51 percent of the cases included in the database. Over 14 percent of the profiled exonerations were of defendants who were convicted of crimes that never happened, and around the same number of exonerations occurred among defendants who confessed to crimes they did not commit; 7 percent of the exonerations were of innocent defendants who pled guilty. The data also shows that false convictions vary by crime: For murder, the biggest cause of wrongful convictions was perjury, usually by a witness who claimed to have seen the crime or even participated in it. In rape cases, false convictions are almost always based on eyewitness mistakes – more often than not, mistakes by white victims who misidentified black defendants. False convictions for robbery were also almost always caused by eyewitness misidentifications. Child sex abuse exonerations were almost always from fabricated crimes that never occurred.
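The cross-tabulation behind findings like these, cause of wrongful conviction by crime type, is straightforward to reproduce once the records have been normalized. The records below are invented stand-ins for illustration, not data from the registry:

```python
from collections import Counter

# Hypothetical normalized records in the spirit of the registry's fields:
# each exoneration carries a crime type and a principal cause.
records = [
    {"crime": "murder", "cause": "perjury"},
    {"crime": "murder", "cause": "perjury"},
    {"crime": "rape", "cause": "mistaken_eyewitness"},
    {"crime": "robbery", "cause": "mistaken_eyewitness"},
    {"crime": "child_sex_abuse", "cause": "no_crime_occurred"},
]

def causes_by_crime(rows):
    """Cross-tabulate cause of wrongful conviction by crime type."""
    table = {}
    for row in rows:
        table.setdefault(row["crime"], Counter())[row["cause"]] += 1
    return table

for crime, counts in causes_by_crime(records).items():
    print(crime, "->", counts.most_common(1)[0][0])
```

This is exactly why the collection-and-normalization step the paragraph describes matters: until each case is coded into consistent fields, no such pattern can be computed.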

The registry is probably just scratching the surface, as there are no official reporting mechanisms for exonerations. To remedy that problem, the database producers have asked for help in identifying more exonerations. Each reported case must then be researched and carefully analyzed and categorized. Peter Neufeld, director of the Innocence Project, calls the registry the “Wikipedia of Innocence” and in some ways he’s right. In just one month since it went public with 873 entries, the database has grown to 912 entries, thanks in part to help from contributors.

Another noteworthy part of the story is the tale of the database itself. Our friend Rob Warden, Executive Director of Northwestern’s Center on Wrongful Convictions, first told us about his vision for this database nearly 10 years ago and has been working on it since then. Bravo, Rob and colleagues!


Amazon’s Ambitious Agenda

The US Justice Department’s recent antitrust actions against five of the six largest book publishers for allegedly colluding in setting e-book prices come as a nice boost for Amazon, whose pricing policies have long worried the publishers. Amazon, of course, doesn’t need any help, given its dominant, 60% share of the e-book market, not to mention a very strong position in print bookselling, and a growing presence in book publishing. These and other aspects of Amazon’s past, present, and potential future impact on the book industry are well-explained in Steve Wasserman’s “Amazon and the Conquest of Publishing” in the June 18 edition of The Nation. Though not necessarily presenting any new information, the article deftly weaves together a range of information and perspectives about Amazon’s multi-pronged moves to dominate book publishing. Wasserman lays out the history of Amazon’s moves, from its earliest days trying to become the world’s biggest bookstore, to its revolutionizing e-book publishing with the introduction of the Kindle in 2007 (21% of Americans have read an e-book in the last year and Amazon now sells more e-books than print books), to its more recent entry into the role of publisher by hiring industry veterans and starting several imprints.

Among Amazon’s potentially problematic moves for traditional publishers is Kindle Single, a program that could change time-worn author-publisher economics. Authors can publish short essays at a highly-advantageous 70% royalty, but agree to a retail price of no more than $2.99 and receive no advance. A number of prominent writers have participated in this program, some reportedly earning more money than magazines would have paid. The potential for disruptive impact by extending this model to book publishing seems pretty clear from Wasserman’s narrative.
While books are no longer the central focus of Amazon’s business, it can use its vast resources to enter and dominate any aspect of publishing over time (Amazon has $48 billion in revenues, more than the six largest publishing companies combined, and a $5 billion cash reserve).

As a contrast to the Wild West of the US book publishing market, the same issue of The Nation also carries a fascinating article, “How Germany Keeps Amazon at Bay and Literary Culture Alive,” explaining how German laws prohibiting discounting have protected publishers and independent booksellers. Statistics here tell an interesting story: Germany publishes four times as many new books per capita as the US each year, and with a population just one-quarter that of the US, Germany has 80% more independent bookstores. According to the author, former publishing executive and culture minister Michael Naumann, the difference is due to more than Germany being a “kulturnation” of largely well-educated people. The protection of book publishing margins enables publishers to invest in titles that are not likely to make money, such as those of a scholarly or narrow focus. German publishers also benefit by being allowed to write off unsold book inventories, a privilege not allowed by US tax laws. Whereas US antitrust law aims to let competition and innovation proceed as long as it benefits consumers (i.e., lower book prices), Germany’s laws seek to protect book publishers and sellers under the argument that a stable, healthy publishing industry ultimately guarantees that the public will have access to a wide range of books, regardless of price.


It’s the Workflow, Stupid

It’s a rare information business these days that doesn’t have a mantra about embedding its services into its customer workflows. Providers of information content, applications, and services rightly recognize that facilitating more of their customers’ critical tasks increases their value and often makes them difficult for customers to replace.  For all its importance, however, information businesses’ actual progress in developing and deploying workflow solutions has been uneven, and few have yet achieved the strategic benefits of workflow solutions.

One of our favorite examples of a successful workflow application comes not from an information company but from a law firm, Littler Mendelson, the largest labor and employment firm in the U.S. with over 850 attorneys.  One of its bread-and-butter services is handling “administrative charges,” employment complaints filed against companies. (There are over 100,000 such claims filed by employees with the federal Equal Employment Opportunity Commission each year and probably an equal number with state agencies.)  Late last year, the firm launched Littler CaseSmart, a workflow solution which has dramatically changed how the firm interacts with its clients, how it delivers its services, and how it charges.  On the surface, Littler CaseSmart is a technology platform through which the firm manages the full lifecycle of a matter, including processes for interacting externally with clients and internally to maintain the firm’s quality assurance controls.  Clients can monitor the progress of individual matters and access statistical reports through a client dashboard.  What underlies the system, however, is just as important as the technology. One of CaseSmart’s key selling points is its ability to deliver work at a predictable fixed fee per matter.  To make this possible, Littler established a team of flex-time attorneys, employees who wish to work on a part-time basis (typically for work-life balance) and who are paid a fixed fee per matter.

Littler CaseSmart is a compelling case study of how workflow solutions can transform – not just automate – a client relationship.  It offers several lessons for anyone thinking about deploying workflow solutions:

  • Re-engineer the organization, not just the workflow.  Littler Mendelson could have contented itself with simply applying technology to the process of managing cases. Instead it realized that it couldn’t transform its underlying economics and enable the fixed-fee pricing model that drives the whole CaseSmart value proposition unless the firm developed an alternative organizational structure.
  • Change the rules of engagement with clients.  Just as CaseSmart has transformed Littler’s internal operations, it has changed how the firm engages with its clients.  Much of the interaction is now mediated by technology and requires changes in how clients manage their own internal staff and processes.  These types of changes aren’t easy to effect without a compelling value proposition and strong client relationships.
  • Disrupt your own business.  Professional services firms fear nothing more than the demise of the traditional hourly-billing model.  To its credit, Littler recognized that model could ultimately be challenged by competition and client pricing pressure.  In opting to proactively disrupt this model, Littler has been able to implement an alternative model in a manner and on a timetable of its own choosing – while simultaneously putting pressure on competitors who are less prepared to change.
  • Derive extra value from information.  Littler Mendelson falls into the category we call “inadvertent information companies.”  Its business is law, but the capture of statistical information about its cases creates opportunities to deliver new value to its clients in the form of business intelligence.  Previously unavailable, such information is typically highly appreciated by clients and is likely to improve as Littler aggregates more data over time.

There are many different kinds of workflow solutions, and not all of these lessons will apply to every one of them, but they do illustrate the complexities and value of devising workflow strategies that go beyond merely trying to automate a set of tasks.



History Ain’t Bunk

Studying history is a humbling experience, especially for those of us who make our living helping clients navigate changing times. History shows us that change was rarely obvious at the time, no matter how inevitable it may appear in hindsight. More specifically, the pace of change can be either explosively fast or so slow as to go unnoticed until it reaches an inflection point. Here are two of our favorite history books that offer relevant, timeless lessons:

“The Anatomy of a Business Strategy: Bell, Western Electric and the Origins of the American Telephone Industry” by George D. Smith is a bland title for one of the most important technology stories of the last 200 years. Originally formed to license Bell’s technology to independent manufacturers, the Bell Telephone Company felt compelled within a few years to do its own manufacturing when its licensees failed to produce enough reliable, cost-effective equipment. This decision resulted in the purchase of Western Electric, enabling Bell to secure manufacturing capacity and quality. Far more importantly, this decision led to Bell’s strategy of controlling the leasing of its equipment, a strategy which drove the company’s success for a century. The rise of the Bell Telephone Company into a vertically-integrated telephone monopoly appears pre-ordained in retrospect, but it was hardly so. The company’s success hinged on a few strategic decisions that turned out to be prescient, and a good deal of luck, including Western Union settling various claims against Bell and exiting the telephone business. This book is 25 years old, but timeless as a business case.

Make a list of the most-disruptive technologies of the 20th century and it probably won’t include the shipping container – but it should. “The Box: How the Shipping Container Made the World Smaller and the World Economy Bigger” by Marc Levinson chronicles this complex and unlikely story. Containers had been tried on and off by various individual shipping lines, but it took a trucking magnate, Malcom McLean, to revolutionize sea and land shipping by envisioning the container as a standardized, interchangeable component across both modes of transport. McLean’s ambitious vision took in the whole transportation system from end to end, resulting in radical changes to ports, loading systems, ship design, labor, and even government regulation. In the decade following McLean’s brainchild in the mid-1950s, the shipping container drove down the cost of shipping by 90 percent and enabled companies to put their factories where labor was cheapest, not necessarily near customers or near ports. The shipping container was a key technology driving “inevitable” globalization, but Levinson’s book shows that it took McLean’s combination of vision, ruthless ambition, engineering, salesmanship, and leveraged buy-out pioneering to pull it off.
