'Incremental Attribution' is currently rolling out to Meta Ads accounts. (First analysis here: https://lnkd.in/eDiX6eB7) This new opt-in attribution setting optimizes for and reports on incremental conversions: conversions that would not have occurred without the ad being shown. How it optimizes ad delivery in the background is probably a black box... Meta only refers to 'sophisticated models'. 🧪 Still, I think you should consider testing this new option against optimizing for volume, where your focus is on generating as many attributed conversions as possible. If it really generates incremental conversions, it can steer growth. Meta's 'Conversion Lift' test uses a similar concept: it measures incremental performance using a 'test' group, which is exposed to your ads, and a 'control' group, which is not. The behavior of both groups is then compared to measure the incremental impact of your ads, meaning the 'additional' conversions that occurred because of the ads themselves rather than organic traffic or other factors. Do you have access to this new attribution setting yet? You can check by creating a new 'Sales' campaign and navigating to the ad set level. Right below your conversion event selection, expand the 'Show more options' section. There you should see the option to select the 'Incremental attribution' setting. Are you going to run a test? --- #MetaAds update spotted by Bram Van der Hallen 👉 Follow for more Meta Ads content.
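The test/control arithmetic behind a lift measurement is simple to sketch: the control group's conversion rate becomes the baseline for what the exposed group would have done anyway. A minimal, hypothetical example; the numbers and function names are illustrative, not Meta's actual methodology:

```python
def incremental_conversions(test_users, test_conversions,
                            control_users, control_conversions):
    """Estimate conversions caused by ads rather than merely attributed."""
    test_rate = test_conversions / test_users           # exposed to ads
    control_rate = control_conversions / control_users  # held out from ads
    # Baseline: what the test group would have converted to without ads
    expected_without_ads = test_users * control_rate
    incremental = test_conversions - expected_without_ads
    lift = (test_rate - control_rate) / control_rate    # relative lift
    return incremental, lift

inc, lift = incremental_conversions(
    test_users=100_000, test_conversions=1_200,
    control_users=100_000, control_conversions=1_000,
)
print(f"Incremental conversions: {inc:.0f}, lift: {lift:.0%}")
# → Incremental conversions: 200, lift: 20%
```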
Marketing Attribution Models
-
Head of Marketing: I'm fixing our attribution by asking customers how they found us.
CEO: No you're not. That's not scalable.
Head of Marketing: Neither is last-click attribution... Dark social is 70% of our current pipeline and we're pretending it doesn't exist.
CEO: What are you on about... what's dark social?
Head of Marketing: Every untrackable share that actually drives decisions. Like screenshots shared in Slack, Discord, or a text between two VPs.
CEO: We can't measure that... and we can't optimize what we can't measure.
Head of Marketing: We can't measure it because our attribution model is broken. Our tools only tell us "what did they click?" Instead we should just ask THEM "how did you actually find us?"
CEO: And people will just... tell us?
Head of Marketing: They already are. I added one question to our onboarding calls: "Walk me through how you discovered us." Turns out our biggest deal this month came from a Reddit thread that got screenshotted, shared in a private Discord, then forwarded to their team Slack.
CEO: Our attribution says "direct traffic."
Head of Marketing: Exactly my point.
CEO: The board wants dashboards.
Head of Marketing: The board wants revenue. I can show them a spreadsheet of "how we actually found you" from every customer. That's real attribution.
CEO: How do we scale this?
Head of Marketing: One question in every sales call, saved to a field in the CRM. Actual insights that marketing can work off of.
CEO: If this doesn't work...
Head of Marketing: It's already working. We stopped tracking last-click nonsense and started listening. Turns out customers will tell you the truth if you just ask.
PS - Dark social drives more B2B pipeline than any channel you're tracking. Thoughts?
I'm Chris Cunningham - I run social media at ClickUp. Follow me for more actionable marketing tips & tricks.
-
Marketers are skeptical of attribution models. And honestly, they should be. Most are built on shaky assumptions, like giving all the credit to the last ad someone clicked before buying. Many are black boxes. I'm always in search of research on better ways to measure marketing's impact. So this week on The Marketing Architects Podcast we covered a study titled "Bayesian Modeling of Marketing Attribution" by Ritwik Sinha and David Arbour from Adobe Research and Aahlad Manas Puli from NYU. The researchers modeled customer journeys probabilistically, looking at things like ad decay, exposures across different channels, and purchase probability. All of that came together to change the chance of a sale over time. One finding: when users saw more than 20 ads in a short window, the chance of a sale went down. Another takeaway: search and display ads had extremely short half-lives. Their influence faded fast. The model also assigned strong credit to owned and offline channels, which traditional digital attribution methods often ignore. (❤️📺) The Bayesian model doesn't just assign credit; it gives a sense of how much a channel mattered, how long its effect lasted, and how confident you should be in the results. Even if your brand isn't ready to adopt a model like this, it's interesting to learn about, and it backs up why it's important to invest in multiple models and perspectives. Links in the comments to listen to the podcast + read the study.
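As a toy illustration of two of the study's ideas (exponentially decaying ad influence, with short half-lives for search, and fatigue above roughly 20 exposures), here's a sketch of a purchase-probability model. This is not the paper's actual Bayesian model; the base rate, channel weights, half-lives, and fatigue penalty are all invented for illustration:

```python
import math

def purchase_probability(exposures, t_now, base_logit=-5.0):
    """exposures: list of (timestamp_hours, channel_weight, half_life_hours)."""
    logit = base_logit
    for t, weight, half_life in exposures:
        age = t_now - t                     # hours since the exposure
        decay = 0.5 ** (age / half_life)    # influence halves every half-life
        logit += weight * decay
    if len(exposures) > 20:                 # fatigue: heavy frequency hurts
        logit -= 0.1 * (len(exposures) - 20)
    return 1 / (1 + math.exp(-logit))       # logistic link to a probability

# A TV spot 48h ago (long half-life) vs a search click 2h ago (short one):
p = purchase_probability([(0.0, 0.8, 72.0), (46.0, 1.5, 3.0)], t_now=48.0)
print(f"purchase probability: {p:.4f}")
```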
-
Facebook/Meta's default attribution is 7-day click and 1-day view. But there are several reasons you shouldn't have much confidence in 1-day view attribution. Any type of click attribution means an actual ad click was recorded. This leaves a click ID, which can be tracked and verified by third-party platforms. Any type of view attribution means no ad click occurred: the ad platform is taking credit for an attributed purchase even though the ad was only seen, not clicked. Here's why 1-day view can't be verified:
❌ No one gets access to Meta's impression logs.
❌ No one gets access to Meta's pixel/event data after it's been matched to a user.
❌ No one can validate, verify, or check that these impressions had any influence on a purchase.
❌ No one can tell how much of these attributed purchases are based on data modeling (also known as "guessing") versus real data.
Lots of marketers defend 1-day view: "but what if someone watches a video, then opens a browser and searches for the product, and then purchases?" Yes, that happens sometimes. How often? How much credit should be given? No one knows. 🤷🏻♂️ With 1-day view you are trusting Meta 100% to tell you when ad impressions alone are responsible for purchases, without any ability whatsoever to check their work. 1-day view can be useful, but it's important to learn how to show it separately from 7-day click so you can get a more accurate picture of your attribution. See page 2 of the attached image to learn how to add an attribution split to your Ads Manager reporting views. Use 1-day view as a reference, but don't put too much trust in it.
-
You walked out of your house, took the bus, walked some more, took the subway, walked some more, and got to the lift. You pressed the lift button, took the lift to your floor, and then walked to your final destination. But because the lift button was pressed, the person who pressed it for you gets the credit for getting you to your destination. This is exactly how last-click attribution works. Similarly, someone sees and likes your LinkedIn post. A week later, they get retargeted and also see a friend liking your post. Then they Google you and click on a search ad that leads to them contacting you. Click. Conversion. Was it your Google ad that did it? Perhaps it helped. Somewhat. But the truth is that the person searched for you because your brand was credible and familiar to them. Assuming that was your only conversion, a week later you look at your search performance and lament that conversions are down. So, because search was the channel that brought the numbers last time, you put more money into it to drive more conversions. But that's just like telling people to press the lift button for you every time, because it got you to your destination before. You cannot measure conversions this way. You cannot give all the glory to last click like this. Your branding did a lot of the work; your walking and commute, your deliberate actions, got you to your destination, but they don't get the glory. Last click is still useful - it is simple, auditable, and close to revenue - but it doesn't tell you the full story. If you're optimizing your budget & business on part of the story, don't be surprised if performance and/or growth disappoints.
-
Apple's privacy changes killed iOS attribution. But Google just cracked the code! If you are running UAC app campaigns on Google, you can now track conversions on iOS seamlessly. Their new on-device conversion measurement processes user actions directly on the iPhone. No personal data ever leaves the device, but you get the conversion signals you need for optimization. Here's how it actually works:
1/ Two ways to measure - getting data from user sign-ins (like email addresses and phone numbers) and getting anonymous data from what people do in your app. Google says to use both methods together for best results.
2/ Connecting Apple's system to Google Ads - instead of keeping Apple's tracking separate from your Google campaigns, you link them together so they work as one system.
3/ Combined measurement system - puts both tracking methods together so you get instant, complete performance data through your tracking tools. This lets you make faster, better decisions instead of waiting weeks for Apple's reports.
Advertisers with a majority of logged-in users saw a median 19% CPA reduction on Google's inventory after implementation. But this only works if your app has sign-in functionality for the first-party data variant. For apps without logins, you rely on the deidentified event data approach. The system enhances campaign optimization by boosting conversion observability while maintaining real-time reporting in third-party attribution platforms. Have you tried it yet?
-
Marketers, are you still measuring email the old way? We get told email is dead, but everyone reading this has most likely read an email, logged in with one, & made a purchase with one. So it's not dead, but how we judge its effectiveness hasn't evolved as fast. We've relied on open rates & click-through rates (CTR) - metrics that, frankly, are no longer fit for purpose.
Why open rates are no longer reliable: open tracking depends on image loading, which Outlook often blocks, & Apple & Gmail preload by default. As a result, you might be seeing machine opens, not human ones. And proper visibility is vanishing with more "text-only" creatives and image-blocked environments.
And CTR? It's got its own problems. Think about user intent. If a customer reads "50% off this weekend" in your subject line, they may just go straight to your site - no click needed. Even Gmail's AI summarising content & extracting voucher codes means users engage without clicks. Email is quickly becoming a powerhouse for brand awareness, but it doesn't have the metrics to prove it.
So, what should we look at? As the rest of adtech races toward incrementality, attention, and post-impression attribution, email needs to catch up. Here's how:
1. Conversion Attribution (Beyond Last Click) - Don't stop at click-based conversions. Track who received the email, & assign influence weightings to openers, clickers, & even non-clickers who later convert. This mirrors how display and social now assess "view-through" impact.
2. Frequency & Multi-Touch Engagement - Did the recipient open on mobile in the morning, revisit via desktop, & convert on payday? That's a multi-touch journey. Look at repeat site visits, device switching, & re-engagement post-send.
3. Payday or Trigger-Based Lift - Create holdout groups and measure uplift around high-conversion moments (e.g., end of month). This mirrors the incrementality testing often used in paid social or programmatic, proving that email drives behaviour, not just volume.
4. Attention Metrics - Use tools to estimate dwell time on emails or the time between opening & clicking. These are soft proxies for intent, similar to how platforms measure scroll depth, hover rate, and ad exposure time in other channels.
5. Site Quality Metrics - Did email recipients spend longer on site, view more pages, or have higher AOVs? Your session quality tells you whether email delivers high-intent traffic, something brands already monitor for Google Ads or affiliates.
6. Ask them! Simple, but powerful: survey your audience. Which emails did they find valuable? Did they change their behaviour? Self-reported attribution, done well, can give you what click-tracking can't.
Email deserves more credit than this. If adtech is shifting toward attention, incrementality, & deeper behaviour analysis, email should, too. Let's measure actual impact, not just opens & clicks. I bet you will discover that email isn't just for conversion but also a brand-building superpower.
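Of the ideas above, the holdout test is the easiest to operationalise. A minimal sketch, with made-up names and illustrative figures, of splitting a send list and computing uplift over the held-out baseline:

```python
import random

def holdout_split(recipients, holdout_share=0.1, seed=42):
    """Randomly assign each recipient to the 'send' or 'holdout' group."""
    rng = random.Random(seed)
    groups = {"send": [], "holdout": []}
    for r in recipients:
        key = "holdout" if rng.random() < holdout_share else "send"
        groups[key].append(r)
    return groups

def uplift(send_conv, send_n, hold_conv, hold_n):
    """Relative lift of the mailed group over the holdout baseline."""
    send_rate = send_conv / send_n
    hold_rate = hold_conv / hold_n
    return (send_rate - hold_rate) / hold_rate

groups = holdout_split(list(range(10_000)))            # ~1,000 held out at 10%
print(f"uplift: {uplift(540, 9_000, 40, 1_000):.0%}")  # 6% vs 4% → 50% lift
```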
-
New Metric on LinkedIn Ads: "Conversions (Data-Driven Attribution)"
If you're running LinkedIn campaigns, you may have seen this option show up in the performance view: Conversions (Data-Driven Attribution), and similarly for Leads. This is still new, and I had to look up some info on it, but here's what to know:
What it is: this metric uses a machine-learning model to allocate credit across multiple ad touchpoints in the buyer's journey - not just the last click or last view.
What's different / what to watch: it only appears in the performance chart view, so you may not (yet) see full campaign- or ad-level breakdowns. Because it estimates "true contribution" rather than counting every touch, you might see lower numbers than you're used to with "Each" or "Last Touch" models. For longer B2B journeys (multiple ads, impressions, clicks) this gives a more realistic assessment of which ads actually drove the conversion.
Why you should care: you can now lean on a metric that better reflects impact, not just activity. It lets your audit/strategy discussions shift from "how many touches" to "which touches were meaningful".
Suggested action: continue tracking your standard conversion metrics (Each, Last Touch) for context. Add the DDA metric to your reporting and call out the difference: realistic contribution vs broad exposure. Use this as part of the conversation: "Here's what the model shows as true performance, here's what our touch-path exposure looked like, so here's where optimizations go." Note the limitation: if you can't break the DDA metric down by campaign or ad yet, flag that in your audit and plan for when deeper granularity becomes available.
This is one of several new features I've seen rolling out lately, but they also seem to be immature or not ready for full rollout yet. I want to share this in case anyone else has additional insights they can contribute, or so you can check it out for yourself if it's available!
#linkedinads #newfeature #b2bmarketing
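LinkedIn hasn't published how its DDA model works, but one common data-driven approach is "removal effect" credit: for each channel, estimate how many converting paths would be lost entirely if that channel vanished, then normalize those losses into credit shares. A hedged, simplified sketch (real removal-effect models use path transition probabilities; the touchpoint names here are invented):

```python
def removal_effect_credit(converting_paths):
    """Credit each channel by its 'removal effect': the share of converting
    paths that would have had no touchpoints at all without that channel."""
    total = len(converting_paths)
    channels = {ch for path in converting_paths for ch in path}
    effects = {}
    for ch in channels:
        # conversions that keep at least one touchpoint if `ch` is removed
        surviving = sum(1 for path in converting_paths
                        if any(t != ch for t in path))
        effects[ch] = (total - surviving) / total
    norm = sum(effects.values()) or 1.0     # normalize shares to sum to 1
    return {ch: e / norm for ch, e in effects.items()}

# Hypothetical converting paths (touchpoint names are invented):
paths = [
    ["sponsored_content", "retargeting"],
    ["sponsored_content"],
    ["retargeting"],
    ["sponsored_content", "message_ad", "retargeting"],
]
for channel, share in sorted(removal_effect_credit(paths).items()):
    print(f"{channel}: {share:.0%}")
```

Here "message_ad" earns no credit because every path it appears in also contains other touchpoints, while the two channels that carried sole conversions split the credit.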
-
Many traditional methods for measuring the effectiveness of advertising, such as multi-touch attribution (MTA), are becoming less effective due to stricter privacy regulations and the phasing out of third-party cookies. 👉 To fill this gap, geo-tests are reliable tools for measuring "incrementality," the core concept in advertising impact assessment. Geo-tests help answer a fundamental question: how much of your key performance indicators (KPIs) can be attributed to your advertising efforts, and how much would have been achieved without them? With geo-tests, you can differentiate between the audience that acted because it was exposed to your ads and the audience that would have acted the same way without them. The best part is that geo-tests can be run on both digital and offline mediums: social media, paid ads, TV, radio, OOH, mail, etc. For instance, a women's lifestyle and personal care brand based in Texas was experiencing stagnant growth and struggling to quantify the real impact of its diverse media channels and tactics on the business's bottom line. The brand opted to run geo-tests to understand its true revenue drivers. The result was a 3.1x uplift in marketing efficiency with Lifesight | Unified Marketing Measurement Platform. #measurement #geoexperiments #marketinganalytics
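The core geo-test calculation can be sketched as a difference-in-differences: use the KPI drift in the control regions (where ads didn't run) as the counterfactual for the test regions. The region figures below are illustrative, not from any real test:

```python
def geo_incrementality(test_pre, test_post, control_pre, control_post):
    """Difference-in-differences estimate of the ad-driven KPI change."""
    # How the control geos drifted without ads (seasonality, trend, ...)
    control_growth = control_post / control_pre
    # Counterfactual: test geos with the same drift but no ads
    expected_test = test_pre * control_growth
    incremental = test_post - expected_test
    return incremental, incremental / expected_test

inc, lift = geo_incrementality(
    test_pre=50_000, test_post=66_000,        # revenue in ad-on regions
    control_pre=48_000, control_post=57_600,  # revenue in ad-off regions
)
print(f"Incremental revenue: {inc:.0f} ({lift:.0%} lift)")
# → Incremental revenue: 6000 (10% lift)
```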
-
I see this mistake constantly, and it's costing brands money. Teams have built entire strategies around attribution models that don't tell the whole story. Last-click attribution says your paid search is working. Your ROAS looks incredible. The budget flows toward search because the data says it's the top performer. But here's what last-click attribution doesn't show you: the awareness campaigns that put your brand in consideration. The organic content that built trust. The email that reminded someone to convert. Attribution is useful. But it is not true. Brands that rely solely on platform attribution often make three expensive mistakes.
First, they underinvest in upper-funnel work. Awareness, consideration, brand building - these don't show up in last-click models. So they get starved of budget. Revenue growth slows. Cost per acquisition rises. And nobody understands why.
Second, they overinvest in hyper-targeted short-term activity. Lower-funnel channels look amazing in attribution reports. So the budget concentrates there. But eventually customer acquisition costs rise because you've removed the awareness foundation that feeds the funnel.
Third, they miss the long-term compounding effects. Brand signals operate on different timelines than clicks. Mental availability builds over months. Consideration compounds over time. But if you're only measuring last click, you're optimising away the work that compounds.
Here's what actually works: measure both layers. Measure your short-term performance metrics: conversions, revenue, customer acquisition cost. These matter. But also measure the leading indicators of long-term success: brand awareness, consideration, customer retention. Measure them separately. Track them over time. Build strategy around both. Stop treating attribution models as complete pictures. Start measuring what they miss. What's your biggest blind spot in how you measure marketing?