Does the ‘Big’ in Big Data Matter?

I recently had the opportunity to participate in a panel discussion at the Broadcasting & Cable Advanced Advertising Summit called “Big Changes From Big Data,” along with big data experts from ABC, 4C Insights, Discovery Communications, and iSpot.tv. In a compelling conversation on a complex topic, one thing remained clear: it is not a lack of scale, variety, or richness of data that is holding the industry back, nor is it deficient technology. The heart of the challenge is how the premium video economy can align on a shared context around the data and tap into it as an ecosystem. Here’s my current take on the state of big data in the industry.

Balkanization of data

Today, data is siloed not only across the various partners within the ecosystem, but also fragmented by devices and environments within each entity. Sharing data among the various players is hard due to a lack of trust and transparency, along with concerns over data leakage. Because different partners are trying to overcome these challenges in disparate ways, such as bilateral data sharing or building local DMPs, we end up with suboptimal outcomes, and everyone leaves money on the table.

One thing that can truly benefit the industry as a whole is finding the means or mechanisms to share data in a fully trusted and decentralized manner that respects the rights of each data owner, greasing the data wheels across the entire ecosystem. This would enable advertisers to buy media more effectively across all devices and environments and increase the value of ad inventory for Publishers by orders of magnitude.

Harmonizing data across the ecosystem

Does using specialized data sets create impact at scale? The market tells us that harmonizing the interpretation of data, and its value, gets harder as the data gets more sophisticated. Most ad spend continues to revolve around simple, easy-to-measure data points like demo and reach. As a result, the desire to innovate is dramatically reduced on both sides: Publishers have little incentive to develop newer types of data, and advertisers have little incentive to act on them, since not everyone values or interprets the newer data the same way.

One of FreeWheel’s strategic pillars is unifying data sets and currencies to make it easier for the ecosystem to cooperate and to enable the re-aggregation of scale across all types of data. This creates the ability to transact on multiple data sets, currencies, and audience segments, streamlining the associated workflows and providing the in-depth insights needed to better understand the audience and add value to inventory.

Bridging the science of data with the art of content + context

How can we structure internal organizations around information flow to create the right blend of deep data science and the essence of content and context that has been built over time? This is critical because it allows both sides to understand and speak each other’s language, rather than creating ivory towers of siloed data science practice hidden within the organization.

At FreeWheel, we are testing this hypothesis by embedding our data scientists at the junctures of various functions. This could unleash the power of data not only across the targeting lifecycle but also in content execution. In the process, this structure should uncover those priceless intersections where the ad message, content, and context resonate.

The bottom line is yes, the ‘big’ in big data matters. We believe the ‘big’ in big data is going to trigger another layer of innovation in advanced targeting. FreeWheel’s current focus is on removing the existing barriers for the premium video ecosystem. Stay tuned for future posts diving deeper into the big data conversation.

Reprint from FreeWheel Views
