Aug 16, 18: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

image
Accuracy check  Source

[ AnalyticsWeek BYTES]

>> How can a few snippets of code help you clean up your database for a faster performance? by thomassujain

>> Jul 27, 17: #AnalyticsClub #Newsletter (Events, Tips, News & more..) by admin

>> Top Online Video Analytics Tools That You Should Use by thomassujain

Wanna write? Click Here

[ NEWS BYTES]

>> IoT Is Building Higher Levels Of Customer Engagement – Forbes Under IOT

>> Global Hadoop Market Insights and Trends 2016 – 2022 – Paris Ledger Under Hadoop

>> Cloud Security: 3 Identity and Access Management Musts | CSO … – CSO Online Under Cloud Security

More NEWS? Click Here

[ FEATURED COURSE]

Baseball Data Wrangling with Vagrant, R, and Retrosheet

image

Analytics with the Chadwick tools, dplyr, and ggplot…. more

[ FEATURED READ]

Introduction to Graph Theory (Dover Books on Mathematics)

image

A stimulating excursion into pure mathematics aimed at “the mathematically traumatized,” but great fun for mathematical hobbyists and serious mathematicians as well. Requiring only high school algebra as mathematical bac… more

[ TIPS & TRICKS OF THE WEEK]

Grow at the speed of collaboration
Research by Cornerstone OnDemand pointed out the need for better collaboration within the workforce, and the data analytics domain is no different. A rapidly changing and growing industry like data analytics is very difficult for an isolated workforce to keep up with. A good collaborative work environment facilitates a better flow of ideas, improved team dynamics, rapid learning, and an increased ability to cut through the noise. So, embrace collaborative team dynamics.

[ DATA SCIENCE Q&A]

Q:How do you handle missing data? What imputation techniques do you recommend?
A: * If data are missing at random, deletion introduces no bias but decreases the power of the analysis by reducing the effective sample size
* Recommended: KNN imputation, Gaussian mixture imputation

Source
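For a concrete starting point, here is a minimal sketch of the recommended KNN imputation using scikit-learn’s KNNImputer; the toy matrix and the choice of n_neighbors are illustrative assumptions, not from the source.

```python
import numpy as np
from sklearn.impute import KNNImputer

# Toy feature matrix with values assumed to be missing at random (np.nan).
X = np.array([
    [1.0, 2.0, np.nan],
    [3.0, 4.0, 3.0],
    [np.nan, 6.0, 5.0],
    [8.0, 8.0, 7.0],
])

# Each missing entry is filled with the mean of that feature across the
# k nearest rows, where distance is computed over the observed coordinates.
imputer = KNNImputer(n_neighbors=2)
X_imputed = imputer.fit_transform(X)
print(X_imputed)
```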

[ VIDEO OF THE WEEK]

Discussing #InfoSec with @travturn, @hrbrmstr(@rapid7) @thebearconomist(@boozallen) @yaxa_io

Subscribe to YouTube

[ QUOTE OF THE WEEK]

Data that is loved tends to survive. – Kurt Bollacker, Data Scientist, Freebase/Infochimps

[ PODCAST OF THE WEEK]

Understanding #BigData #BigOpportunity in Big HR by @MarcRind #FutureOfData #Podcast

Subscribe 

iTunes  GooglePlay

[ FACT OF THE WEEK]

14.9 percent of marketers polled in Crain’s BtoB Magazine are still wondering ‘What is Big Data?’

Sourced from: Analytics.CLUB #WEB Newsletter

The History and Use of R

Looking for a refresher on R? Check out the following talk, which covers R in depth and answers some of the important questions concerning its adoption.

A presentation on the history, design, and use of R. The talk will focus on companies that use and support R, use cases, where it is going, competitors, advantages and disadvantages, and resources to learn more about R.

Speaker Bio

Joseph Kambourakis has been the Lead Data Science Instructor at EMC for over two years. He has taught in eight countries and been interviewed by Japanese and Saudi Arabian media about his expertise in Data Science. He holds a Bachelor’s in Electrical and Computer Engineering from Worcester Polytechnic Institute and an MBA from Bentley University with a concentration in Business Analytics.

Sponsor: MediaMath | HackReduce

Video:

Slideshare:

Originally Posted at: The History and Use of R

How big data is transforming the construction industry

Big data analytics is being adopted at a rapid rate across every industry. It enables businesses to manage and analyze vast amounts of data at ultrafast speeds, and obtain valuable insights that can improve their decision-making processes.

One of the industries that are reaping the benefits of this technology is the construction industry. Construction companies are using big data to perform a wide range of tasks, from data management to pre-construction analysis.

Here is a look at how big data is transforming the construction industry…

How Construction Companies are Leveraging Big Data Analytics

Handling Large Amounts of Data

Construction companies often juggle many projects at the same time, and they have to collect, produce, organize, and analyze a great deal of data for these projects.

Other than creating work reports and progress reports, they also have to manage technical information on various aspects of their projects. All the unstructured data that is collected and generated can burden their databases.

Big data solutions make it possible for construction companies to process massive amounts of data at unprecedented speeds, enabling them to save substantial time and effort, and focus more on the job site instead of IT issues.

Depending on which big data tools they use, they can improve almost every data-related process, from database management to report creation.

According to an article entitled “How Big Data is Transforming the World of Finance”, big data can help businesses create reports on their operations more frequently, or in real time, so that they can make well-informed decisions on a consistent basis.

Predicting Risk

In order to plan and execute projects effectively, construction companies need to be able to predict risks accurately through intelligent use of data.

By implementing big data analytics, they can gain valuable insights that enable them to improve cost certainty, identify and avoid potential problems, and find opportunities for efficiency improvements.

One example of a construction company that is using big data analytics to predict risk is Democrata.

Democrata conducts surveys to gain a better understanding of the impact of new roads, high rail links and other construction projects, and uses big data analytics to perform searches and queries on data sets to obtain insights that can lead to better and faster decision-making.

Solving Problems

The ability to solve problems quickly can contribute significantly to the successful completion of construction projects.

Liberty Building Forensics Group is a company that investigates and solves construction and design problems, and it has provided consultation on over 500 projects worldwide, including a Walt Disney project.

According to the company, forensic issues usually occur in major construction projects, and they can cause big problems, such as failure to meet deadlines, if they are not properly assessed.

In order to fix forensic issues efficiently, construction companies have to be able to collect the right data in an organized way and make the data accessible to the right people at the right time. This can be achieved through the implementation of big data solutions.

Presently, big data analytics is relatively immature in terms of functionality and robustness.

As it continues to become more advanced, it will be more widely adopted in the construction industry.

John McMalcolm is a freelance writer who writes on a wide range of subjects, from social media marketing to technology.

Originally posted via “How big data is transforming the construction industry”

Originally Posted at: How big data is transforming the construction industry

Aug 09, 18: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

image
Conditional Risk  Source

[ LOCAL EVENTS & SESSIONS]

More WEB events? Click Here

[ AnalyticsWeek BYTES]

>> 2018 Trends in Artificial Intelligence: Beyond Machine Learning for Internal and External Personalization by jelaniharper

>> Customer Loyalty Resource for Customer Experience Professionals by bobehayes

>> Estimating Other “Likelihood to Recommend” Metrics from Your Net Promoter Score (NPS) by bobehayes

Wanna write? Click Here

[ NEWS BYTES]

>> Goldman Sachs enlists staff for cyber security war games – Financial Times Under cyber security

>> DECKER NAMED TO GOOGLE CLOUD ACADEMIC ALL-DISTRICT® FIRST TEAM – Dominican College Athletics Under Cloud

>> Kadant Inc (NYSE:KAI) Institutional Investor Sentiment Analysis – Frisco Fastball Under Sentiment Analysis

More NEWS? Click Here

[ FEATURED COURSE]

Machine Learning

image

6.867 is an introductory course on machine learning which gives an overview of many concepts, techniques, and algorithms in machine learning, beginning with topics such as classification and linear regression and ending … more

[ FEATURED READ]

Antifragile: Things That Gain from Disorder

image

Antifragile is a standalone book in Nassim Nicholas Taleb’s landmark Incerto series, an investigation of opacity, luck, uncertainty, probability, human error, risk, and decision-making in a world we don’t understand. The… more

[ TIPS & TRICKS OF THE WEEK]

Data Analytics Success Starts with Empowerment
Being data driven is not as much a tech challenge as it is an adoption challenge. Adoption has its roots in the cultural DNA of any organization. Great data-driven organizations weave the data-driven culture into their corporate DNA. A culture of connection, interaction, sharing, and collaboration is what it takes to be data driven. It’s about being empowered more than it’s about being educated.

[ DATA SCIENCE Q&A]

Q:Compare R and Python
A: R
– Focuses on better, user-friendly data analysis, statistics, and graphical models
– The closer you are to statistics, data science and research, the more you might prefer R
– Statistical models can be written with only a few lines in R
– The same piece of functionality can be written in several ways in R
– Mainly used for standalone computing or analysis on individual servers
– Large number of packages, for anything!

Python
– Used by programmers who want to delve into data science
– The closer you are to an engineering environment, the more you might prefer Python
– Coding and debugging are easier, mainly because of the nice syntax
– Any piece of functionality is always written the same way in Python
– When data analysis needs to be implemented with web apps
– Good tool to implement algorithms for production use

Source
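As a concrete illustration of the Python side of this comparison, here is a minimal sketch that fits a simple linear model with the statsmodels formula API; the toy data frame and column names are made up, and R users would write the equivalent as lm(sales ~ spend, data = df).

```python
import pandas as pd
import statsmodels.formula.api as smf

# Tiny illustrative data set.
df = pd.DataFrame({
    "sales": [10, 12, 15, 19, 24],
    "spend": [1.0, 1.5, 2.0, 2.8, 3.5],
})

# Ordinary least squares via an R-style formula, fit in a couple of lines.
model = smf.ols("sales ~ spend", data=df).fit()
print(model.summary())
```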

[ VIDEO OF THE WEEK]

Making sense of unstructured data by turning strings into things

Subscribe to YouTube

[ QUOTE OF THE WEEK]

Data are becoming the new raw material of business. – Craig Mundie

[ PODCAST OF THE WEEK]

Understanding #BigData #BigOpportunity in Big HR by @MarcRind #FutureOfData #Podcast

Subscribe 

iTunes  GooglePlay

[ FACT OF THE WEEK]

By 2020, at least a third of all data will pass through the cloud (a network of servers connected over the Internet).

Sourced from: Analytics.CLUB #WEB Newsletter

Three final talent tips: how to hire data scientists

This last post focuses on less tangible aspects, related to curiosity, clarity about what kind of data scientist you need, and having appropriate expectations when you hire.

8. Look for people with curiosity and a desire to solve problems

Radhika Kulkarni, PhD in Operations Research, Cornell University, teaching calculus as a grad student.

As I blogged previously, Greta Roberts of Talent Analytics will tell you that the top traits to look for when hiring analytical talent are curiosity, creativity, and discipline, based on a study her organization did of data scientists. It is important to discover whether your candidates have these traits, because they are what it takes to find a practical solution, and they separate strong candidates from those who may get lost in theory. My boss Radhika Kulkarni, the VP of Advanced Analytics R&D at SAS, recognized this pattern in herself when she arrived at Cornell to pursue a PhD in math. That realization prompted her to switch to operations research, which she felt would allow her to investigate practical solutions to problems, which she preferred to more theoretical research.

That passion continues today, as you can hear Radhika describe in this video on moving the world with advanced analytics. She says “We are not creating algorithms in an ivory tower and throwing it over the fence and expecting that somebody will use it someday. We actually want to build these methods, these new procedures and functionality to solve our customers’ problems.” This kind of practicality is another key trait to evaluate in your job candidates, in order to avoid the pitfall of hires who are obsessed with finding the “perfect” solution. Often, as Voltaire observed, “Perfect is the enemy of good.” Many leaders of analytical teams struggle with data scientists who haven’t yet learned this lesson. Beating a good model to death for that last bit of lift leads to diminishing returns, something few organizations can afford in an ever-more competitive environment. As an executive customer recently commented during the SAS Analytics Customer Advisory Board meeting, there is an “ongoing imperative to speed up that leads to a bias toward action over analysis. 80% is good enough.”

9. Think about what kind of data scientist you need

Ken Sanford, PhD in Economics, University of Kentucky, speaking about how economists make great data scientists at the 2014 National Association of Business Economists Annual Meeting. (Photo courtesy of NABE)

Ken Sanford describes himself as a talking geek, because he likes public speaking. And he’s good at it. But not all data scientists share his passion and talent for communication. This preference may or may not matter, depending on the requirements of the role. As this Harvard Business Review blog post points out, the output of some data scientists will be to other data scientists or to machines. If that is the case, you may not care if the data scientist you hire can speak well or explain technical concepts to business people. In a large organization or one with a deep specialization, you may just need a machine learning geek and not a talking one! But many organizations don’t have that luxury. They need their data scientists to be able to communicate their results to broader audiences. If this latter scenario sounds like your world, then look for someone with at least the interest and aptitude, if not yet fully developed, to explain technical concepts to non-technical audiences. Training and experience can work wonders to polish the skills of someone with the raw talent to communicate, but don’t assume that all your hires must have this skill.

10. Don’t expect your unicorns to grow their horns overnight

Annelies Tjetjep, M.Sc., Mathematical Statistics and Probability from the University of Sydney, eating frozen yogurt.

Annie Tjetjep relates development for data scientists to frozen yogurt, an analogy that illustrates how she shines as a quirky and creative thinker, in addition to working as an analytical consultant for SAS Australia. She regularly encounters customers looking for data scientists who have only chosen the title, without additional definition. She explains: “…potential employers who abide by the standard definitions of what a ‘data scientist’ is (basically equality on all dimensions) usually go into extended recruitment periods and almost always end up somewhat disappointed – whether immediately because they have to compromise on their vision or later on because they find the recruit to not be a good team player….We always talk in dimensions and checklists but has anyone thought of it as a cycle? Everyone enters the cycle at one dimension that they’re innately strongest or trained for and further develop skills of the other dimensions as they progress through the cycle – like frozen yoghurt swirling and building in a cup…. Maybe this story sounds familiar… An educated statistician who picks up the programming then creativity (which I call confidence), which improves modelling, then business that then improves modelling and creativity, then communication that then improves modelling, creativity, business and programming, but then chooses to focus on communication, business, programming and/or modelling – none of which can be done credibly in Analytics without having the other dimensions. The strengths in the dimensions were never equally strong at any given time except when they knew nothing or a bit of everything – neither option being very effective – who would want one layer of froyo? People evolve unequally and it takes time to develop all skills and even once you develop them you may choose not to actively retain all of them.”

So perhaps you hire someone with their first layer of froyo in place and expect them to add layers over time. In other words, don’t expect your data scientists to grow their unicorn horns overnight. You can build a great team if they have time to develop as Annie describes, but it is all about having appropriate expectations from the beginning.

To learn more, check out this series from SAS on data scientists, where you can read Patrick Hall’s post on the importance of keeping the science in data science, interviews with data scientists, and more.

And if you want to check out what a talking geek sounds like, Ken will be speaking at a National Association of Business Economists event next week in Boston – Big Data Analytics at Work: New Tools for Corporate and Industry Economics. He’ll share the stage with another talking geek, Patrick Hall, a SAS unicorn I wrote about in my first post.

To read the original article on SAS, click here.

Source

Aligning Sales Talent to Drive YOUR Business Goals

5steps_analytics
A confluence of new capabilities is creating an innovative, more precise approach to performance improvement. New approaches include advanced analytics, refined sales competency and behavioral models, adaptive learning, and multiple forms of technology enablement. In a prior post (The Myth of the Ideal Sales Profile) we explored an emerging new paradigm that is disrupting traditional thinking with respect to best practices: the world according to YOU.

However, with only 17% of sales organizations leveraging sales talent analytics (TDWI Research), it seems that most CSOs and their HR business partners are gambling — using intuition as the basis for making substantial investments in sales development initiatives. If the gamble doesn’t pay off, then the investment is wasted.

Is your sales talent aligned to your company’s strategy of increasing revenue? According to the Conference Board, 73% of CEOs say no. This lack of alignment is the main reason why 86% of CSOs expect to miss their 2015 revenue targets (CSO Insights). The ability to properly align your sales talent to your company’s business goals is the difference between being in the 86% and being in the 14%.

What Happens When You Assume?

Historically, sales and Human Resource leaders based sales talent alignment decisions — both development of the existing team and acquisition of future talent — on assumptions and somewhat subjective data.

Common practices include:

  • Polling the field to determine the focus for sales training
  • Hiring sales talent based largely on the subjective opinion of interviewers
  • Defining your “ideal seller profile” based on the guidance of industry pundits
  • Making a hiring decision based on the fact that the candidate made Achiever’s Club 3 of the last 5 years at their previous company
  • Deploying a sales training program based on what a colleague did at their last company

Aligning sales talent based on any of the above is likely to land your company in the 86% because these approaches fail far more times than they succeed. They fail to consider the many cause-and-effect elements that impact success in your company, in your markets, for your products, and for your customers. As proof of their low success rate, a groundbreaking study by ES Research found that 90% of sales training [development initiatives] had no lasting impact after 120 days. And the news isn’t any better when it comes to sales talent acquisition; Accenture reports that the average ramp-up time for new reps is 7-12 months.

Defining YOUR Ideal Seller Profile(s)

So how does your organization begin to apply the “new way” (see illustration below) as an approach to optimize sales performance? It begins with zeroing in on the capabilities of your salespeople that align most closely to the specific goals of your business. In essence, it means understanding what YOUR ideal seller profiles are.

Applying the new way begins with the specific business goals of your company. What if market share growth were the preeminent strategic goal for your organization? Would it not be extremely valuable to understand which sales competencies were most likely to impact that aspect of your corporate strategy? The obvious answer is yes; and the obvious question is how to align and optimize sales to drive increased market share.

How does a CSO identify where to target development in order to have the biggest impact on business results?

By using facts as the basis for these substantial investments. Obtaining facts requires several essential ingredients. The first is a rigorous, comprehensive model for sales competencies; that is, a well-defined model of “what good looks like” for a broad range of sales competencies. This model can be adapted for a specific selling organization, and it provides the baseline for sales-specific assessments (personality, knowledge, cognitive ability, behavior, etc.).

Then, by applying advanced analytics, including Structural Equation Modeling (SEM), we can begin to identify cause-and-effect relationships between specific competencies and the metrics and goals of YOUR organization. With SEM, CSOs can statistically identify the knowledge and behavior that set top performers apart from the rest of their team. With this valuable insight, the organization can now align both talent development and acquisition to the company’s most important business goals.
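Full SEM requires specialized tooling, but the core idea of statistically linking competencies to outcomes can be sketched with a much simpler stand-in. The sketch below uses plain logistic regression (not SEM) to score which hypothetical competency ratings separate top performers from the rest; all column names and data are illustrative assumptions.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler

# Hypothetical competency ratings (1-5) and a top-performer flag
# (e.g. top quartile of quota attainment). Not real data.
reps = pd.DataFrame({
    "prospecting":       [3, 4, 2, 5, 4, 1, 5, 2],
    "negotiation":       [2, 5, 3, 4, 5, 2, 4, 1],
    "product_knowledge": [4, 4, 2, 5, 3, 2, 5, 3],
    "top_performer":     [0, 1, 0, 1, 1, 0, 1, 0],
})

X = StandardScaler().fit_transform(reps.drop(columns="top_performer"))
y = reps["top_performer"]

# Standardized coefficients give a rough ranking of which competencies
# are most associated with being a top performer in this toy sample.
clf = LogisticRegression().fit(X, y)
for name, coef in zip(reps.columns[:-1], clf.coef_[0]):
    print(f"{name}: {coef:+.2f}")
```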

Sales Talent Analytics Provide Proof

Times have changed. The days of aligning sales talent based on gut feel, assumptions or generally accepted best-practices are over. By leveraging sales talent analytics, today’s sales leader can apply a proven 3-step approach to stop gambling and get the facts to statistically pinpoint where to focus development of the sales team, quantifiably measure the business impact / ROI of that development, and improve the quality of new hires. But buyer beware; not all analytical approaches are equal. The vast majority leverage correlation-based analytics which can lead to erroneous conclusions.

By the way, we’re not eschewing well-designed research that provides insights into the broader application of best practices. Aberdeen Group found that best-in-class sales teams that leverage data and analytics increased team quota attainment 12.3% YOY (vs. 1% for an average company) and increased average deal size 8% YOY (vs. 0.8%).

It’s time to define the ideal seller profile for YOUR company. In our next post in this series, we answer the question – how do we capitalize on that understanding to drive the highest impact on our business goals?

Source: Aligning Sales Talent to Drive YOUR Business Goals by analyticsweekpick

Aug 02, 18: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

image
Pacman  Source

[ LOCAL EVENTS & SESSIONS]

More WEB events? Click Here

[ AnalyticsWeek BYTES]

>> Better than Master Data Management: Building the Ultimate Customer 360 with Artificial Intelligence by jelaniharper

>> Eradicating Silos Forever with Linked Enterprise Data by jelaniharper

>> December 26, 2016 Health and Biotech analytics news roundup by pstein

Wanna write? Click Here

[ NEWS BYTES]

>> Collision Course: Foreign Influence Operations, Data Security and Privacy – Lexology Under Data Security

>> euNetworks brings data center interconnect services to Dublin and Hilversum – LightWave Online Under Data Center

>> Syllabus for a course on Data Science Ethics – Boing Boing Under Data Science

More NEWS? Click Here

[ FEATURED COURSE]

Pattern Discovery in Data Mining

image

Learn the general concepts of data mining along with basic methodologies and applications. Then dive into one subfield in data mining: pattern discovery. Learn in-depth concepts, methods, and applications of pattern disc… more

[ FEATURED READ]

Big Data: A Revolution That Will Transform How We Live, Work, and Think

image

“Illuminating and very timely . . . a fascinating — and sometimes alarming — survey of big data’s growing effect on just about everything: business, government, science and medicine, privacy, and even on the way we think… more

[ TIPS & TRICKS OF THE WEEK]

Grow at the speed of collaboration
Research by Cornerstone OnDemand pointed out the need for better collaboration within the workforce, and the data analytics domain is no different. A rapidly changing and growing industry like data analytics is very difficult for an isolated workforce to keep up with. A good collaborative work environment facilitates a better flow of ideas, improved team dynamics, rapid learning, and an increased ability to cut through the noise. So, embrace collaborative team dynamics.

[ DATA SCIENCE Q&A]

Q:What is the maximal margin classifier? How this margin can be achieved?
A: * When the data can be perfectly separated using a hyperplane, there actually exists an infinite number of these hyperplanes
* Intuition: a hyperplane can usually be shifted a tiny bit up, or down, or rotated, without coming into contact with any of the observations
* Large margin classifier: choosing the hyperplane that is farthest from the training observations
* This margin can be achieved using support vectors

Source
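A minimal sketch of this idea, assuming scikit-learn: a linear SVC with a very large C approximates the hard (maximal) margin solution on separable data, and the fitted hyperplane is determined entirely by the support vectors. The toy points below are illustrative only.

```python
import numpy as np
from sklearn.svm import SVC

# Two perfectly separable classes in 2-D.
X = np.array([[1, 1], [2, 1], [1, 2],   # class 0
              [4, 4], [5, 4], [4, 5]])  # class 1
y = np.array([0, 0, 0, 1, 1, 1])

# A very large C essentially forbids margin violations,
# approximating the maximal margin hyperplane.
clf = SVC(kernel="linear", C=1e6).fit(X, y)

print("support vectors:\n", clf.support_vectors_)
print("weights:", clf.coef_[0], "intercept:", clf.intercept_[0])
```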

[ VIDEO OF THE WEEK]

@chrisbishop on futurist's lens on #JobsOfFuture #FutureofWork #JobsOfFuture #Podcast

Subscribe to YouTube

[ QUOTE OF THE WEEK]

It’s easy to lie with statistics. It’s hard to tell the truth without statistics. – Andrejs Dunkels

[ PODCAST OF THE WEEK]

#BigData @AnalyticsWeek #FutureOfData #Podcast with Joe DeCosmo, @Enova

Subscribe 

iTunes  GooglePlay

[ FACT OF THE WEEK]

94% of Hadoop users perform analytics on large volumes of data not possible before; 88% analyze data in greater detail; while 82% can now retain more of their data.

Sourced from: Analytics.CLUB #WEB Newsletter

May 8, 2017 Health and Biotech analytics news roundup

HHS’s Price Affirms Commitment to Health Data Innovation: Secretary Price emphasized the need to decrease the burden on physicians.

Mayo Clinic uses analytics to optimize laboratory testing: The company Viewics makes software for the facility, which uses it to look for patterns and increase efficiency.

Nearly 10,000 Global Problem Solvers Yield Winning Formulas to Improve Detection of Lung Cancer in Third Annual Data Science Bowl: The winners of the competition, which challenged contestants to accurately diagnose lung scans, were announced.

Gene sequencing at Yale finding personalized root of disease; new center opens in West Haven: The Center for Genomic Analysis at Yale opened and is intended to help diagnose patients.

Source

7 Limitations Of Big Data In Marketing Analytics

Big data — the cutting edge of modern marketing or an overhyped buzzword? Columnist Kohki Yamaguchi dives into some of the limitations of user-centered data.

As everyone knows, “big data” is all the rage in digital marketing nowadays. Marketing organizations across the globe are trying to find ways to collect and analyze user-level or touchpoint-level data in order to uncover insights about how marketing activity affects consumer purchase decisions and drives loyalty.

In fact, the buzz around big data in marketing has risen to the point where one could easily get the illusion that utilizing user-level data is synonymous with modern marketing.

This is far from the truth. Case in point, Gartner’s hype cycle as of last August placed “big data” for digital marketing near the apex of inflated expectations, about to descend into the trough of disillusionment.

It is important for marketers and marketing analysts to understand that user-level data is not the end-all be-all of marketing: as with any type of data, it is suitable for some applications and analyses but unsuitable for others.

Following is a list describing some of the limitations of user-level data and the implications for marketing analytics.

1. User Data Is Fundamentally Biased

The user-level data that marketers have access to covers only individuals who have visited your owned digital properties or viewed your online ads, which is typically not representative of the total target consumer base.

Even within the pool of trackable cookies, the accuracy of the customer journey is dubious: many consumers now operate across devices, and it is impossible to tell for any given touchpoint sequence how fragmented the path actually is. Furthermore, those who operate across multiple devices are likely to be from a different demographic than those who use only a single device, and so on.

User-level data is far from being accurate or complete, which means that there is inherent danger in assuming that insights from user-level data apply to your consumer base at large.

2. User-Level Execution Only Exists In Select Channels

Certain marketing channels are well suited for applying user-level data: website personalization, email automation, dynamic creatives, and RTB spring to mind.

In many channels, however, it is difficult or impossible to apply user data directly to execution except via segment-level aggregation and whatever other targeting information is provided by the platform or publisher. Social channels, paid search, and even most programmatic display are based on segment-level or attribute-level targeting at best. For offline channels and premium display, user-level data cannot be applied to execution at all.

3. User-Level Results Cannot Be Presented Directly

More accurately, they can be presented via a few visualizations, such as flow diagrams, but these tend to be incomprehensible to all but domain experts. This means that user-level data needs to be aggregated up to at least a daily segment level or property level for the results to be broadly consumable.
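As a rough illustration of that aggregation step, here is a minimal pandas sketch that rolls hypothetical touchpoint-level records up to a daily, segment-level summary; all field names and values are made up.

```python
import pandas as pd

# Hypothetical touchpoint-level records.
touchpoints = pd.DataFrame({
    "user_id":   ["a", "a", "b", "c", "c"],
    "segment":   ["new", "new", "returning", "new", "new"],
    "timestamp": pd.to_datetime([
        "2015-03-01 09:00", "2015-03-01 14:00", "2015-03-01 10:30",
        "2015-03-02 08:15", "2015-03-02 20:45",
    ]),
    "converted": [0, 1, 0, 0, 1],
})

# Roll up to one row per day and segment.
daily = (touchpoints
         .assign(date=touchpoints["timestamp"].dt.date)
         .groupby(["date", "segment"])
         .agg(touches=("user_id", "size"),
              users=("user_id", "nunique"),
              conversions=("converted", "sum"))
         .reset_index())
print(daily)
```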

4. User-Level Algorithms Have Difficulty Answering “Why”

Largely speaking, there are only two ways to analyze user-level data: one is to aggregate it into a “smaller” data set in some way and then apply statistical or heuristic analysis; the other is to analyze the data set directly using algorithmic methods.

Both can result in predictions and recommendations (e.g. move spend from campaign A to B), but algorithmic analyses tend to have difficulty answering “why” questions (e.g. why should we move spend) in a manner comprehensible to the average marketer. Certain types of algorithms, such as neural networks, are black boxes even to the data scientists who designed them. Which leads to the next limitation:

5. User Data Is Not Suited For Producing Learnings

This will probably strike you as counter-intuitive. Big data = big insights = big learnings, right?

Wrong! For example, let’s say you apply big data to personalize your website, increasing overall conversion rates by 20%. While certainly a fantastic result, the only learning you get from the exercise is that you should indeed personalize your website. This result raises the bar on marketing, but it does nothing to raise the bar for marketers.

Actionable learnings that require user-level data – for instance, applying a look-alike model to discover previously untapped customer segments – are relatively few and far between, and they require tons of effort to uncover. Boring ol’ small data remains far more efficient at producing practical real-world learnings that you can apply to execution today.

6. User-Level Data Is Subject To More Noise

If you have analyzed regular daily time series data, you know that a single outlier can completely throw off analysis results. The situation is similar with user-level data, but worse.

In analyzing touchpoint data, you will run into situations where, for example, a particular cookie received – for whatever reason – a hundred display impressions in a row from the same website within an hour (happens much more often than you might think). Should this be treated as a hundred impressions or just one, and how will it affect your analysis results?

Even more so than “smaller” data, user-level data tends to be filled with so much noise and potentially misleading artifacts, that it can take forever just to clean up the data set in order to get reasonably accurate results.
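One possible (and deliberately simplistic) cleanup rule for the example above, sketched with pandas: collapse repeated impressions from the same cookie and site within the same hour into a single record. The convention, field names, and data are assumptions for illustration only.

```python
import pandas as pd

# Hypothetical raw impression log, including a burst from one cookie.
impressions = pd.DataFrame({
    "cookie_id": ["x1", "x1", "x1", "x1", "x2"],
    "site":      ["news.example"] * 4 + ["blog.example"],
    "timestamp": pd.to_datetime([
        "2015-03-01 10:01", "2015-03-01 10:05", "2015-03-01 10:40",
        "2015-03-01 11:10", "2015-03-01 10:20",
    ]),
})

# Keep at most one impression per cookie, site, and hour.
deduped = (impressions
           .assign(hour=impressions["timestamp"].dt.floor("H"))
           .drop_duplicates(subset=["cookie_id", "site", "hour"]))
print(deduped)
```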

7. User Data Is Not Easily Accessible Or Transferable

Because of security concerns, user data cannot be made accessible to just anyone, and requires care in transferring from machine to machine, server to server.

Because of scale concerns, not everyone has the technical know-how to query big data in an efficient manner, which causes database admins to limit the number of people who have access in the first place.

Because of the high amount of effort required, whatever insights that are mined from big data tend to remain a one-off exercise, making it difficult for team members to conduct follow-up analyses and validation.

All of these factors limit agility of analysis and ability to collaborate.

So What Role Does Big Data Play?

So, given all of these limitations, is user-level data worth spending time on? Absolutely — its potential to transform marketing is nothing short of incredible, both for insight generation as well as execution.

But when it comes to marketing analytics, I am a big proponent of picking the lowest-hanging fruit first: prioritizing analyses with the fastest time to insight and the largest potential value. Analyses of user-level data fall squarely in the high-effort, slow-delivery camp, with variable and difficult-to-predict value.

Big data may have the potential to yield more insights than smaller data, but it will take much more time, consideration, and technical ability in order to extract them. Meanwhile, there should be plenty of room to gain learnings and improve campaign results using less granular data. I have yet to see such a thing as a perfectly managed account, or a perfectly executed campaign.

So yes, definitely start investing in big data capabilities. Meanwhile, let’s focus as much if not more in maximizing value from smaller data.

Note: In this article I treated “big data” and “user-level data” synonymously for simplicity’s sake, but the definition of big data can extend to less granular but more complex and varied data sets.

Originally posted via “7 Limitations Of Big Data In Marketing Analytics”

Originally Posted at: 7 Limitations Of Big Data In Marketing Analytics by analyticsweekpick

Jul 26, 18: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

image
Tour of Accounting  Source

[ LOCAL EVENTS & SESSIONS]

More WEB events? Click Here

[ AnalyticsWeek BYTES]

>> 66 job interview questions for data scientists by analyticsweekpick

>> @JohnNives on ways to demystify AI for enterprise #FutureOfData by admin

>> The Value of Enterprise Feedback Management Vendors by bobehayes

Wanna write? Click Here

[ NEWS BYTES]

>> 7 Things Lawyers Should Know About Artificial Intelligence – Above the Law Under Artificial Intelligence

>> Cyber security shorts: Daniel Schatz, Perform Group – IBC365 Under cyber security

>> Cloud is not a commodity – avoiding the ‘Cloud Trap’ – ITProPortal Under Cloud

More NEWS? Click Here

[ FEATURED COURSE]

R Basics – R Programming Language Introduction

image

Learn the essentials of R Programming – R Beginner Level!… more

[ FEATURED READ]

Big Data: A Revolution That Will Transform How We Live, Work, and Think

image

“Illuminating and very timely . . . a fascinating — and sometimes alarming — survey of big data’s growing effect on just about everything: business, government, science and medicine, privacy, and even on the way we think… more

[ TIPS & TRICKS OF THE WEEK]

Winter is coming, warm your Analytics Club
Yes and yes! As we head into winter, what better time to talk about our increasing dependence on data analytics to help with our decision making. Data- and analytics-driven decision making is rapidly sneaking its way into our core corporate DNA, yet we are not building practice grounds to test those models fast enough. Such snug-looking models have hidden nails that can cause uncharted pain if they go unchecked. This is the right time to start thinking about putting an Analytics Club [Data Analytics CoE] in your workplace to lab out best practices and provide a test environment for those models.

[ DATA SCIENCE Q&A]

Q:What are confounding variables?
A: * Extraneous variable in a statistical model that correlates directly or inversely with both the dependent and the independent variable
* A spurious relationship is a perceived relationship between an independent variable and a dependent variable that has been estimated incorrectly
* The estimate fails to account for the confounding factor

Source
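A minimal simulation of the first bullet, assuming NumPy: a confounder Z drives both X and Y, so X and Y appear strongly correlated even though neither causes the other; adjusting for Z makes the spurious relationship vanish.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

z = rng.normal(size=n)           # confounder
x = 2 * z + rng.normal(size=n)   # driven by z, not by y
y = 3 * z + rng.normal(size=n)   # driven by z, not by x

print("corr(x, y):", np.corrcoef(x, y)[0, 1])  # strong, but spurious
# After removing the confounder's contribution, the correlation is near zero.
print("corr of residuals:", np.corrcoef(x - 2 * z, y - 3 * z)[0, 1])
```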

[ VIDEO OF THE WEEK]

#BigData @AnalyticsWeek #FutureOfData #Podcast with Eloy Sasot, News Corp

Subscribe to YouTube

[ QUOTE OF THE WEEK]

The most valuable commodity I know of is information. – Gordon Gekko

[ PODCAST OF THE WEEK]

@TimothyChou on World of #IOT & Its #Future Part 1 #FutureOfData #Podcast

Subscribe 

iTunes  GooglePlay

[ FACT OF THE WEEK]

140,000 to 190,000: the shortage of people with deep analytical skills needed to fill the demand for Big Data jobs in the U.S. by 2018.

Sourced from: Analytics.CLUB #WEB Newsletter