Aug 03, 17: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

[Image: Productivity (Source)]

[ AnalyticsWeek BYTES]

>> Periodic Table Personified [image] by v1shal

>> How to Win Business using Marketing Data [infographics] by v1shal

>> October 31, 2016 Health and Biotech analytics news roundup by pstein

Wanna write? Click Here

[ NEWS BYTES]

>> Israeli cyber co Waterfall teams with insurance specialists – Globes (under Cyber Security)

>> Call Centers, Listen Up: 3 Steps to Define the Customer Experience at “Hello” – Customer Think (under Customer Experience)

>> Australian companies spending up on big data in 2017 – ChannelLife Australia (under Big Data Analytics)

More NEWS? Click Here

[ FEATURED COURSE]

CS109 Data Science


Learning from data in order to gain useful predictions and insights. This course introduces methods for five key facets of an investigation: data wrangling, cleaning, and sampling to get a suitable data set; data managem… more

[ FEATURED READ]

The Signal and the Noise: Why So Many Predictions Fail–but Some Don’t


People love statistics. Statistics, however, do not always love them back. The Signal and the Noise, Nate Silver’s brilliant and elegant tour of the modern science-slash-art of forecasting, shows what happens when Big Da… more

[ TIPS & TRICKS OF THE WEEK]

Finding success in data science? Find a mentor
Yes, most of us don’t feel the need, but most of us really could use one. Since most data science professionals work in isolation, getting an unbiased perspective is not easy. It is often also hard to see how a data science career will progress. A network of mentors addresses these issues: it gives data professionals an outside perspective and an unbiased ally. It’s extremely important for successful data science professionals to build a mentor network and use it throughout their careers.

[ DATA SCIENCE Q&A]

Q: Explain the difference between “long” and “wide” format data. Why would you use one or the other?
A: * Long: one column contains the values and another column lists the context of each value, e.g. columns fam_id, year, fam_inc
* Wide: each different variable gets its own column, e.g. columns fam_id, fam_inc96, fam_inc97, fam_inc98

Long vs. wide:
– Data manipulations such as summarizing and filtering by group are generally easier in the long format
– Program requirements: some tools expect one shape, some the other (a pandas sketch of converting between them follows below)

Source
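
To make the distinction concrete, here is a minimal sketch in Python (pandas assumed) that converts the family-income example above between the two shapes; the numbers are hypothetical.

```python
import pandas as pd

# Hypothetical wide-format data mirroring the example above:
# one row per family, one income column per year.
wide = pd.DataFrame({
    "fam_id": [1, 2],
    "fam_inc96": [40000, 52000],
    "fam_inc97": [41000, 53500],
    "fam_inc98": [42500, 55000],
})

# Wide -> long: one row per (family, year) observation.
long = wide.melt(id_vars="fam_id", var_name="year", value_name="fam_inc")
long["year"] = long["year"].str.replace("fam_inc", "19").astype(int)

# Long-format data is easy to summarize and filter by group.
print(long.groupby("year")["fam_inc"].mean())

# Long -> wide again, e.g. for a program that expects one column per year.
back_to_wide = long.pivot(index="fam_id", columns="year", values="fam_inc")
```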

[ VIDEO OF THE WEEK]

Discussing Forecasting with Brett McLaughlin (@akabret), @Akamai

Subscribe to YouTube

[ QUOTE OF THE WEEK]

Getting information off the Internet is like taking a drink from a firehose. – Mitchell Kapor

[ PODCAST OF THE WEEK]

#BigData @AnalyticsWeek #FutureOfData #Podcast with @MPFlowersNYC, @enigma_data

Subscribe: iTunes | GooglePlay

[ FACT OF THE WEEK]

In 2015, a staggering 1 trillion photos were taken, and billions of them were shared online. By 2017, nearly 80% of photos were expected to be taken on smartphones.

Sourced from: Analytics.CLUB #WEB Newsletter

From Mad Men to Math Men: Is Mass Advertising Really Dead?

The Role of Media Mix Modeling and Measurement in Integrated Marketing.

At the end of the year, Ogilvy is coming out with a new book on the role of mass advertising in the new world of digital media, omni-channel and quantitative marketing in which we now live. Forrester Research speculated about a more balanced approach to marketing measurement at its recent conference, proclaiming that gone are the days of the unmeasured Mad Men approach to advertising, with large ad buys that drove only soft metrics such as brand impressions and customer consideration. The new, more mathematical approach to ad measurement, in which programs carry hard metrics, many of them financial (sales, ROI, quality customer relationships), is here to stay. The hypothesis Forrester put forward in its talk was that marketing has almost swung too far toward quantitative measurement, and the best blend gives marketing research as well as quantitative and behavioral data a role in measuring integrated campaigns. So what does that mean for mass advertising, you ask?

First, and ad agencies can breathe a sigh of relief, mass marketing is not dead, but it is subject to several new standards, namely:

Mass will be a smaller part of the omni-channel mix of activities toward which CMOs can allocate their spending, and that allocation should be guided by concrete measures; for very large buckets of spend, media mix or marketing mix modeling can help with decision making. The last statistic we saw from Forrester was that digital media spend was about to surpass, or had already surpassed, mass media ad spend. So CMOs should not be surprised to see that SEM, Google AdWords, digital/social and direct marketing make up a significant 50% or more of the overall investment.
CFOs are asking CMOs for the returns on programmatic and digital media buys. How much money does each media buy make, and how do you know it’s working? Gone are the days of “always on” mass advertising that could get away with reporting back only GRPs or brand health measures. The faster CMOs get on board with this shift, the better they can ensure a dynamic process for marketing investments.
A willingness to turn off and match-market test mass media to ensure that it is still working. Many firms need to assess whether TV and print work for their brand or campaign in light of their current target markets. Many ad agencies and marketing service providers (Nielsen, Acxiom and many more) have advanced audience selection and matching tools to help with this problem. These tools typically integrate audience profiling as well as privacy-protected behavioral information.
The need to run more integrated campaigns with standard offers across channels, and a smarter way of connecting the call to action and the drivers to omni-channel within each medium. For example, mention in other online advertising that consumers can download the firm’s app from the app store. Integrating channels within a campaign will require more complex measurement attribution rules as well as additional test marketing and test-and-learn principles.
In this post we briefly explore two of these changes, namely media mix modeling and advertising measurement options. If you want more specifics, please feel free to contact us at CustomerIntelligence.net.

First, let’s establish that there is always a way to measure mass advertising; it is not true that you need to leave it turned on for all eternity to do so. For example, if you want to understand whether a media buy in NYC, or in a matched market like LA (both large markets), brings in the required level of sales and inquiries to other channels, we posit that you can always:

Conduct a simple pre- and post-campaign lift analysis to determine the level of sales and other performance metrics before, during and after the ad campaign has run (a minimal sketch of this follows the list).
Hold out a group of matched markets to serve as a control for performance comparison against the market where the ad runs.
Shut off the media in a market for a very brief period. Comparing “dark” period performance with the “on” period, with seasonality adjustments, yields intelligence about performance and perhaps a performance factor or baseline from which to measure going forward; such a factor can be leveraged for future measurement without shutting off programs. This is what we call dues paying in marketing analytics: you may have to sacrifice a small amount of sales to read the results each year, but it is one way to ensure mass advertising stays measurable.
Finally, study any behavioral changes in cross-sell and upsell rates of current customers who may deepen their relationship because of the campaign you are running.
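
To illustrate the first two approaches, here is a minimal sketch (pandas assumed; the markets, weeks and unit sales are hypothetical). A real analysis would use longer windows, seasonality adjustments and significance tests.

```python
import pandas as pd

# Hypothetical weekly unit sales; the ad ran in NYC (test) during weeks 5-8,
# while LA serves as the matched control market.
sales = pd.DataFrame({
    "market": ["NYC"] * 8 + ["LA"] * 8,
    "week":   list(range(1, 9)) * 2,
    "units":  [100, 98, 103, 101, 130, 128, 135, 131,
               95, 97, 96, 98, 99, 97, 100, 98],
})
ad_weeks = sales["week"] >= 5

def mean_units(market, period):
    rows = (sales["market"] == market) & period
    return sales.loc[rows, "units"].mean()

# Pre/post lift in each market, then net out the control market's trend.
lift_test = mean_units("NYC", ad_weeks) / mean_units("NYC", ~ad_weeks) - 1
lift_ctrl = mean_units("LA", ad_weeks) / mean_units("LA", ~ad_weeks) - 1
print(f"net lift attributable to the campaign: {lift_test - lift_ctrl:.1%}")
```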
Another point worth making is that enterprise marketing automation can help with tracking and measuring ad campaigns. For really large integrated marketing budgets we recommend media mix modeling, or marketing mix modeling. A number of firms (Market Share Partners Inc is one) provide these models, and we can discuss them in future posts. The basic mechanics of marketing mix modeling (MMM) are as follows:

MMM uses econometrics to understand the relationship between sales and the various marketing tactics that drive sales. It:

Combines a variety of marketing tactics (channels, campaigns, etc.) with large data sets of historical performance data
Performs regression modeling and statistical analysis on the available data to estimate the impact of various promotional tactics on sales and to forecast the results of future sets of promotional tactics (a minimal regression sketch follows this list)
Predicts sales based on mathematical correlations to historical marketing drivers and market conditions
Uses predictive analytics to optimize future marketing investments to drive increases in sales, profits and share
Quantifies ROI for each media channel, campaign and execution strategy
Identifies which media vehicles and campaigns are most effective at driving revenue, profit and share
Shows what incremental sales would be at different levels of media spend
Recommends the optimal spending mix by channel to generate the most sales
Establishes a link between individual drivers and sales
Identifies a sales response curve to advertising
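
To make the regression step concrete, here is a minimal sketch using scikit-learn on synthetic weekly spend data; the channels, coefficients and baseline are invented for illustration. A production MMM would add adstock/carryover transforms, saturation curves and seasonality controls.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_weeks = 104

# Hypothetical weekly spend by channel (in $k) and the sales it generates.
tv     = rng.uniform(0, 100, n_weeks)
search = rng.uniform(0, 50, n_weeks)
social = rng.uniform(0, 30, n_weeks)
sales = 200 + 1.8 * tv + 3.2 * search + 0.9 * social + rng.normal(0, 25, n_weeks)

X = np.column_stack([tv, search, social])
model = LinearRegression().fit(X, sales)

# Each coefficient estimates incremental sales per $k of channel spend,
# the raw input to ROI comparisons and budget reallocation.
for name, coef in zip(["tv", "search", "social"], model.coef_):
    print(f"{name}: {coef:.2f} incremental sales per $k")
print(f"baseline (intercept): {model.intercept_:.0f}")
```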
So the good news is… mass advertising is far from dead! Its effectiveness will be judged in the broader context of integrated campaigns, with an eye toward hard returns such as more customers and quality relationships. In addition, mass advertising will be looked at in the context of how it integrates with digital: for example, when the firm runs a TV ad, do searches for the firm’s products and brand increase, and do those searches then convert to sales? The funnel in the new omni-channel world is still being established.

In summary, mass advertising must be understood at the segment, customer and brand level; we believe it has a role in the overall marketing mix when targeted and used efficiently. A more thoughtful view of marketing efficiency is now emerging, one that includes matching the TV ad in the right channel to the right audience, understanding metrics, measures and integration points, and seeing how mass can complement digital channels as part of an integrated omni-channel strategy. Viewing advertising as a discipline separate from digital marketing is on its way to disappearing; marketers must be well versed in both online and offline as the lines continue to blur, change and optimize. Flexibility is key. Organizations will continue to merge internal teams to reflect this integration and to avoid siloed thinking and sub-par results.

We look forward to a dialogue: your thoughts, experience, counterpoints and other points of view on these topics are welcome.

Thanks Tony Branda

CEO CustomerIntelligence.net.


Source: From Mad Men to Math Men: Is Mass Advertising Really Dead?

Better Recruiting Metrics Lead to Better Talent Analytics


According to Josh Bersin in Deloitte’s 2013 report, Talent Analytics: From Small Data to Big Data, 75% of HR leaders acknowledge analytics are important to the success of their organizations. But 51% have no formal talent analytics plans in place, nearly 40% say they don’t have the resources to conduct sound talent analytics, and 56% rate their own workforce analytics skills as poor.

As Bersin further noted in a recent PeopleFluent article, HR Forecast 2014, “Only 14% of the companies we studied are even starting to analyze people-related data in a statistical way and correlate it to business outcomes. The rest are still dealing with reporting, data cleaning and infrastructure challenges.”

There’s a striking gap between the large number of companies that recognize the importance of metrics and talent analytics and the smaller number that actually have the means and expertise to put them to use.

Yes, we do need to gather and maintain the right people data first, such as when and where applicants apply for jobs and the specific skills an employee has. But data is just information captured by the recruiting system or software already in place. On its own, it doesn’t tell a story.

Compare data against goals or thresholds, though, and it turns into insight, a.k.a. workforce metrics: measurements with a goal in mind, otherwise known as key performance indicators (KPIs), all of which gauge quantifiable components of a company’s performance. Metrics reflect critical factors for success and help a company measure its progress toward strategic goals.

But here’s where it gets sticky. You don’t set off on a cross-country road trip until you know how to read the map.

For companies, it’s important to agree on the right business metrics, and it all starts with recruiting. Even with standard metrics for retention and attrition in place, some companies also track dozens of meaningless metrics that are not tied to specific business goals and do not help improve business outcomes.

I’ve seen recruiting organizations spend all their time in the metrics-gathering phase, and never get around to acting on the results — in industry parlance, “boiling the ocean.” You’re far better off gathering a limited number of metrics that you actually analyze and then act upon.

Today many organizations are focused on developing recruiting metrics and analytics because there’s so much data available on candidates and internal employees (regardless of classification). Based on my own recruiting experience and that of many other recruiting leaders, here are what I consider the Top 5 Recruiting Metrics:

1. New growth vs. attrition rates. What percentage of the positions you fill are new hires vs. attrition? This shows what true growth really looks like. If you are hiring mostly due to attrition, it would indicate that selection, talent engagement, development and succession planning need attention. You can also break this metric down by division/department, by manager and more.

2. Quality of hires. More and more, the holy grail of hiring. Happily, all measurable: what individual performances look like, how long they stay, whether or not they are top performers, what competencies comprise their performance, where are they being hired from and why.

3. Sourcing. Measuring not just the what but the why of your best talent pools: job boards, social media, other companies, current employees, etc. This metric should also be applied to quality of hire: you’ll want to know where the best candidates are coming from. Also, if you want to know the percentage rate for a specific source, divide the number of source hires by the number of external hires. (For example, total Monster job board hires divided by total external hires.)

4. Effectiveness ratio. How many openings do you have versus how many you’re actually filling? You can also measure your recruitment rate by dividing the total number of new hires per year by the total regular headcount reporting to work each year. Your requisitions-filled percentage can be tallied by dividing the total number of filled requisitions by the total number of approved requisitions. (These ratios are sketched in code after this list.)

5. Satisfaction rating. An important one, because it’s not paid much attention to when your other metrics are in good shape. Satisfaction ratings can be gleaned from surveys of candidates, new hires and current employees looking for internal mobility. While your overall metrics may be positive, it’s important to find out how people experience your hiring process.
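
Here is a minimal Python sketch of the ratio metrics defined above (source percentage, recruitment rate, requisitions-filled percent); all counts are hypothetical.

```python
# Hypothetical annual counts pulled from an applicant tracking system.
monster_hires  = 24     # hires sourced from one job board
external_hires = 120    # all externally sourced hires
new_hires_year = 150
avg_headcount  = 2000
filled_reqs    = 140
approved_reqs  = 175

source_pct       = monster_hires / external_hires   # share from one source
recruitment_rate = new_hires_year / avg_headcount
req_filled_pct   = filled_reqs / approved_reqs

print(f"job board source rate: {source_pct:.1%}")
print(f"recruitment rate:      {recruitment_rate:.1%}")
print(f"requisitions filled:   {req_filled_pct:.1%}")
```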

As your business leaves behind those tedious spreadsheets and manual reports and moves into Talent Analytics, metrics are going to be what feeds those results. Consider which metrics are the most appropriate for your business — and why. And then, the real analysis can begin, and help your organization make better talent-related decisions.

Article originally appeared HERE.

Source by analyticsweekpick

Jul 27, 17: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

[Image: Data Mining (Source)]

[ AnalyticsWeek BYTES]

>> March 13, 2017 Health and Biotech analytics news roundup by pstein

>> Jun 01, 17: #AnalyticsClub #Newsletter (Events, Tips, News & more..) by admin

>> What “Gangnam Style” could teach about branding: 5 Lessons by d3eksha

Wanna write? Click Here

[ NEWS BYTES]

>> Apple selloff a boon to short sellers – Times of India (under Financial Analytics)

>> Cloud Security Is Not an Either/Or – Security Intelligence (blog) (under Cloud Security)

>> Don’t develop for hybrid cloud without hybrid deployment – ComputerWeekly.com (blog) (under Hybrid Cloud)

More NEWS? Click Here

[ FEATURED COURSE]

Process Mining: Data science in Action


Process mining is the missing link between model-based process analysis and data-oriented analysis techniques. Through concrete data sets and easy to use software the course provides data science knowledge that can be ap… more

[ FEATURED READ]

Data Science from Scratch: First Principles with Python


Data science libraries, frameworks, modules, and toolkits are great for doing data science, but they’re also a good way to dive into the discipline without actually understanding data science. In this book, you’ll learn … more

[ TIPS & TRICKS OF THE WEEK]

Analytics Strategy that is Startup Compliant
With the right tools, capturing data is easy, but not being able to handle data can lead to chaos. One of the most reliable startup strategies for adopting data analytics is TUM, or The Ultimate Metric: the metric that matters most to your startup. Some advantages of TUM: it answers the most important business question, it cleans up your goals, it inspires innovation, and it helps you understand the entire quantified business.

[ DATA SCIENCE Q&A]

Q: What are the drawbacks of the linear model? Are you familiar with alternatives (Lasso, ridge regression)?
A: * Assumes a linear relationship (and well-behaved, additive errors)
* Can’t be used directly for count outcomes or binary outcomes
* Can’t vary model flexibility: overfitting problems
* Alternatives: regularized regression such as ridge and lasso (see question 4 about regularization; a short sketch follows below)

Source
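
As a sketch of the alternatives mentioned above, the following compares ordinary least squares with ridge and lasso using scikit-learn on synthetic data with many features, where plain OLS tends to overfit; the data and alpha values are illustrative only.

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression, Ridge, Lasso
from sklearn.model_selection import cross_val_score

# Synthetic data: 50 features but only 5 truly informative ones.
X, y = make_regression(n_samples=100, n_features=50, n_informative=5,
                       noise=10.0, random_state=0)

for name, model in [("ols", LinearRegression()),
                    ("ridge", Ridge(alpha=1.0)),   # L2: shrinks coefficients
                    ("lasso", Lasso(alpha=1.0))]:  # L1: zeroes some out
    score = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: mean cross-validated R^2 = {score:.3f}")
```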

[ VIDEO OF THE WEEK]

The History and Use of R

Subscribe to YouTube

[ QUOTE OF THE WEEK]

The goal is to turn data into information, and information into insight. – Carly Fiorina

[ PODCAST OF THE WEEK]

#BigData @AnalyticsWeek #FutureOfData #Podcast with Eloy Sasot, News Corp

Subscribe: iTunes | GooglePlay

[ FACT OF THE WEEK]

Walmart handles more than 1 million customer transactions every hour, which are imported into databases estimated to contain more than 2.5 petabytes of data.

Sourced from: Analytics.CLUB #WEB Newsletter

4 Tips to Landing Your Dream Job in Big Data


It is no longer enough to apply for a data-driven job armed with an engineering degree and the ability to program. Over the past few years, big data has expanded outside of IT and business analytics and into departments as diverse as marketing, manufacturing, product development and sales.

As demand for data expertise grows across industries, the bar for what makes an acceptable, or better yet excellent, hire rises at pace. Competition is fierce, particularly among high-growth startups that tout big paychecks, pre-IPO stock options and premium perks. The most coveted data jobs require not only hard skills — like coding in Python or knowing how to stand up a Hadoop cluster — but critical soft skills such as how you present your findings to your company colleagues.

Working in the database industry for the past four years, I have spent numerous hours with candidates vying for an opportunity to join the big-data movement. Building the next great database company requires more than hiring a competent team of engineers with diverse backgrounds, skills and experiences. You have to find people who dare to innovate and embrace unorthodox approaches to work.

After interviewing a range of professionals working in the big data arena, I have found four mantras that distinguish top candidates in the process:

1. Subscribe to a multi-model mindset, not a monolithic one.

We no longer live in a one-solution world. The best solutions are often a mix of ideas and strategies, coalesced from diverse sources. Modern challenges require a distributed approach, which mirrors the distributed nature of the technology platforms emerging today.

2. Have a thesis about data and its positive impact on the world.

Know why you want to work in big data. Your future colleagues want to hear a compelling narrative about the ways in which you believe data is shaping the world, and why it matters. Use concrete examples to bolster your thesis and highlight any work you have done with data previously.

3. Hit the circuit; get the t-shirt.

Making strong interpersonal connections has always been important in business, so hanging out behind your screen all day will not cut it. Make it a point to meet tons of people in the data field. Go to conferences, meetups and other gatherings and actively, as well as respectfully, engage on social-media channels. One of the strongest indicators for success within a technology company is if you were referred in by an existing team member, so it is worth expanding your network.

4. Get your hands dirty.

Download a new software package or try a new tool regularly. New technologies are readily available online on GitHub, and many vendors offer free community editions of their software. Build things. Break things. Fix things. Repeat. Post on your GitHub account so your online profile gets noticed too. This is a great way to show you are a self-motivated, creative problem solver with an active toolkit at your disposal.

As you enter the big data world, it is important to stay in tune with what makes you unique. When in doubt, keep your high-level goals in mind and push yourself to learn new things, meet people outside of your regular band of cohorts and embrace the twists and turns of an evolving industry. You will be a standout in no time.

See Original post HERE.

Source: 4 Tips to Landing Your Dream Job in Big Data by analyticsweekpick

The Evolution Of The Geek [Infographics]

The word geek is a slang term for odd or non-mainstream people, with different connotations ranging from “a computer expert or enthusiast” to “a person heavily interested in a hobby”, with a general pejorative meaning of “a peculiar or otherwise dislikable person, esp[ecially] one who is perceived to be overly intellectual”.

Although often considered as a pejorative, the term is also often used self-referentially without malice or as a source of pride. – Wikipedia

 

Here is an infographic on “The Evolution of the Geek” by Visually.

[Infographic: The Evolution of the Geek]

Source: The Evolution Of The Geek [Infographics]

Jul 20, 17: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

[Image: Data security (Source)]

[ AnalyticsWeek BYTES]

>> Please share your thoughts about Steve Jobs by bobehayes

>> How Big Data Has Changed Finance by anum

>> 6 Customer Experience Practices of Loyalty Leaders by bobehayes

Wanna write? Click Here

[ NEWS BYTES]

>> ‘Internet of Things’ is vulnerable to hackers – Minneapolis Star Tribune (under Internet Of Things)

>> TensorFlow to Hadoop By Way of Datameer – Datanami (under Hadoop)

>> HR and IT combine efforts on workforce analytics – CIO (under Analytics)

More NEWS? Click Here

[ FEATURED COURSE]

Hadoop Starter Kit


Hadoop learning made easy and fun. Learn HDFS, MapReduce and get an introduction to Pig and Hive, with FREE cluster access…. more

[ FEATURED READ]

Hypothesis Testing: A Visual Introduction To Statistical Significance


Statistical significance is a way of determining whether an outcome occurred by random chance, or whether something caused that outcome to differ from the expected baseline. Statistical significance calculations find their … more

[ TIPS & TRICKS OF THE WEEK]

Data Have Meaning
We live in a Big Data world in which everything is quantified. While the emphasis of Big Data has been on distinguishing the three characteristics of data (the infamous three Vs), we need to be cognizant of the fact that data have meaning. That is, the numbers in your data represent something of interest, an outcome that is important to your business. The meaning of those numbers is about the veracity of your data.

[ DATA SCIENCE Q&A]

Q: What are confounding variables?
A: * An extraneous variable in a statistical model that correlates directly or inversely with both the dependent and the independent variable
* A spurious relationship is a perceived relationship between an independent variable and a dependent variable that has in fact been estimated incorrectly
* The estimate fails to account for the confounding factor (a small simulation below makes this concrete)

Source
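
A small simulation makes this concrete: below, a hypothetical confounder Z drives both X and Y, producing a spurious X-Y correlation that shrinks toward zero once Z is included in the regression (NumPy assumed).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
z = rng.normal(size=n)                # confounder
x = 2.0 * z + rng.normal(size=n)      # X depends on Z, not on Y
y = 3.0 * z + rng.normal(size=n)      # Y depends on Z, not on X

# X and Y correlate strongly even though neither causes the other.
print(f"corr(X, Y) = {np.corrcoef(x, y)[0, 1]:.2f}")

# Regress Y on both X and Z: the X coefficient collapses toward zero.
design = np.column_stack([x, z, np.ones(n)])
coefs, *_ = np.linalg.lstsq(design, y, rcond=None)
print(f"X effect after adjusting for Z: {coefs[0]:.3f}")
```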

[ VIDEO OF THE WEEK]

#BigData @AnalyticsWeek #FutureOfData #Podcast with Dr. Nipa Basu, @DnBUS

Subscribe to YouTube

[ QUOTE OF THE WEEK]

He uses statistics as a drunken man uses lamp posts—for support rather than for illumination. – Andrew Lang

[ PODCAST OF THE WEEK]

#BigData @AnalyticsWeek #FutureOfData with Jon Gibs (@jonathangibs) @L2_Digital

Subscribe: iTunes | GooglePlay

[ FACT OF THE WEEK]

Estimates suggest that by better integrating big data, healthcare could save as much as $300 billion a year — that’s equal to reducing costs by $1000 a year for every man, woman, and child.

Sourced from: Analytics.CLUB #WEB Newsletter

A beginners guide to data analytics

This is Part I of our three-part June 2015 print cover story on healthcare analytics. Part I focuses on the first steps of launching an analytics program. Part II focuses on intermediate strategies, and Part III focuses on the advanced stages of an analytics use.

This first part may sting a bit: To those healthcare organizations in the beginning stages of rolling out a data analytics program, chances are you’re going to do it completely and utterly wrong.

At least that’s according to Eugene Kolker, chief data scientist at Seattle Children’s Hospital, who has been working in data analysis for the past 25 years. Speaking about the initial metrics work, he tells Healthcare IT News: “The majority of places, whether they’re small or large, they’re going to do it wrong.” And when you’re dealing with people’s lives, that’s hardly something to take lightly.

Kolker would much prefer that not to be the case, but from his experiences and what he’s seen transpire in the analytics arena across other industries, there’s some unfortunate implications for the healthcare beginners.

“What’s the worst that can happen if Amazon screws up (with analytics)?…It’s not life and death like in healthcare.”

 

But it doesn’t have to be this way. Careful, methodical planning can position an organization for success, he said. But there’s more than a few things you have to pay serious attention to.

First, you need to get executive buy-in. Data analytics can help the organization improve performance in myriad arenas. It can save money in the changing value-based reimbursement world. Better yet, it can save lives. And, if you’re trying to meet strategic objectives, it may be a significant part of the equation there too.

As Kolker pointed out in a presentation given at the April 2015 CDO Summit in San Jose, California, data and analytics should be considered a “core service,” similar to that of finance, HR and IT.

Once you get your executive buy-in, it’s on to the most important part of it all: the people. If you don’t have people with empathy, if you don’t have a team that communicates and manages well, you can count on a failed program, said Kolker, who explained that this process took him years to finally get right. People. Process. Technology – in that order of importance.

“Usually data scientists are data scientists not because they like to work with people but because they like to work with data and computers, so it’s a very different mindset,” he said. It’s important, however, “to have those kind of people who can be compassionate,” who can do analysis without bias.

And why is that? “What’s the worst that can happen if Amazon screws up (with analytics)?” Kolker asked. “It’s not life and death like in healthcare,” where “it’s about very complex issues about very complex people. … The pressure for innovation is much much higher.”

[Part II: Clinical & business intelligence: the right stuff]

[Part III: Advanced analytics: All systems go]

When in the beginning stages of anything analytics, the aim is to start slow but not necessarily to start easy, wrote Steven Escaravage and Joachim Roski, principals at Booz Allen, in a 2014 Health Affairs piece on data analytics. Both have worked on 30 big data projects with various federal health agencies and put forth their lessons learned for those ready to take a similar path.

One of those lessons?

Make sure you get the right data that addresses the strategic healthcare problem you’re trying to measure or compare, not just the data that’s easiest to obtain.

“While this can speed up a project, the analytic results are likely to have only limited value,” they explained. “We have found that when organizations develop a ‘weighted data wish list’ and allocate their resources toward acquiring high-impact data sources as well as easy-to-acquire sources, they discover greater returns on their big data investment.”

So this may lead one to ask: What exactly is the right data? What metrics do you want? Don’t expect a clear-cut answer here, as it’s subjective by organization.

First, “you need to know the strategic goals for your business,” added Kolker. “Then you start working on them, how are you going to get data from your systems, how are you going to compare yourself outside?”

In his presentation at the CDO Summit this April, Kolker described one of Seattle Children’s data analytics projects that sought to evaluate the effectiveness of a vendor tool that predicted severe clinical deterioration, or SCD, of a child’s health versus the performance of a home-grown internal tool that had been used by the hospital since 2009.

After looking at cost, performance, development and maintenance, utility, EHR integration and algorithms, Kolker and his team found, on the buy-versus-build question, that the external vendor tool was not usable for predicting SCD, though it could be tested for something else. Furthermore, the home-grown tool needed to be integrated into the EHR.

Kolker and his team have also helped develop a metric to identify medically complex patients after the hospital’s chief medical officer came to them wanting to boost outcomes for these patients. Medically complex patients typically have high readmissions and consume considerable hospital resources, and SCH wanted to improve outcomes for this group without increasing costs for the hospital.

For folks at the Nebraska Methodist Health System, a population risk management application with a variety of built-in metrics was a big help, explained Linda Burt, chief financial officer of the health system, in a webinar hosted by Healthcare IT News’ sister publication this past April.


“The common ones you often hear of such as admissions per 1,000, ED visits per 1,000, high-risk high-end imaging per 1,000,” she said. Using the application, the health system was able to identify that a specific cancer presented the biggest opportunity for cost alignment.

And health system CFO Katrina Belt’s advice? “We like to put a toe in the water and not do a cannon ball off the high dive,” she advised. Belt, the CFO at Baptist Health in Montgomery, Alabama, said a predictive analytics tool is sifting through various clinical and financial data to identify opportunities for improvement.

In a Healthcare Finance webinar this April, Belt said Baptist Health started by looking at its self-pay population and discovered that although its ER visits were declining, intensive care visits by patients with acute care conditions were on an upward trend.

Belt recommended starting with claims data.

“We found that with our particular analytics company, we could give them so much billing data that was complete and so much that we could glean from just the 835 and 837 file that it was a great place for us to start,” she said. Do something you can get from your billing data, Belt continued, and once you learn “to slice and dice it,” share it with your physicians. “Once they see the power in it,” she said, “that’s when we started bringing in the clinical data,” such as tackling CAUTIs.

But some argue an organization shouldn’t start with an analytics platform. Rather, as Booz Allen’s Escaravage and Roski wrote, start with the problem; then go to a data scientist for help with it.

One federal health agency they worked with on an analytics project, for instance, failed to allow the data experts “free rein” to identify new patterns and insight, and instead provided generic BI reports to end users. Ultimately, the results were disappointing.

“We strongly encouraged the agency to make sure subject matter experts could have direct access to the data to develop their own queries and analytics,” Escaravage and Roski continued.  Overall, when in the beginning phases of any analytics project, one thing to keep in mind, as Kolker reinforced: “Don’t do it yourself.” If you do, “you’re going to fail,” he said. Instead, “do your homework; talk to people who did it.”

To read the complete article on Healthcare IT News, click here.

Source

3 Vendors Lead the Wave for Big Data Predictive Analytics

Enterprises have lots of solid choices for big data predictive analytics.

That’s the key takeaway from Forrester’s just released Wave for Big Data Predictive Analytics Solutions for the second quarter of 2015.

That being said, the products Forrester analysts Mike Gualtieri and Rowan Curran evaluated are quite different.

Data scientists are more likely to appreciate some, while business analysts will like others. Some were built for the cloud, others weren’t.

They all can be used to prepare data sets, develop models using both statistical and machine learning algorithms, and deploy and manage predictive analytics lifecycles, and they all offer tools for data scientists, business analysts and application developers.

General Purpose

It’s important to note that there are plenty of strong predictive analytics solution providers that weren’t included in this Wave, and it’s not because their offerings aren’t any good.

Instead Forrester focused specifically on “general purpose” solutions rather than those geared toward more specific purposes like customer analytics, cross-selling, smarter logistics, e-commerce and so on. BloomReach, Qubit, Certona, Apigee and FusionOps, among others, are examples of vendors in the aforementioned categories.

The authors also noted that the open source software community is driving predictive analytics into the mainstream. Developers have an abundant selection of APIs within reach that they can leverage via popular programming languages like Java, Python and Scala to prepare data and predictive models.

Not only that but, according to the report, many BI platforms also offer “some predictive analytics capabilities.” Information Builders, MicroStrategy and Tibco, for example, integrate with R easily.

The “open source nature” of BI solutions like Birt, OpenText and Tibco Jaspersoft makes R integration simpler.

Fractal Analytics, Opera Solutions, Teradata’s Think Big and Beyond the Arc, among others, also provide worthwhile solutions and were singled out as alternatives to buying software. The authors also noted that larger consulting companies like Accenture, Deloitte, Infosys and Virtusa all have predictive analytics and/or big data practices.

In total, Forrester looked at 13 vendors: Alpine Data Labs, Alteryx, Angoss, Dell, FICO, IBM, KNIME, Microsoft, Oracle, Predixion Software, RapidMiner, SAP and SAS.

Forrester’s selection criteria in the most general sense rates solution providers according to their Current Offering (components include: architecture, security, data, analysis, model management, usability and tooling, business applications) and Strategy (components include acquisition and pricing, ability to execute, implementation support, solution road map, and go-to-market growth rate.) Each main category carries 50 percent weight.

Leading the Wave

IBM, SAS and SAP, three tried and trusted providers, lead this Forrester Wave.

IBM achieved perfect scores in seven of the twelve criteria: Data, Usability and Tooling, Model Management, Ability to Execute, Implementation Support, Solution Road Map and Go-to-Market Growth Rate. “With customers deriving insights from data sets with scores of thousands of features, IBM’s predictive analytics has the power to take on truly big data and emerge with critical insights,” note the report’s authors. Where does IBM fall short? Mostly in the Acquisition and Pricing category.

SAS is the granddaddy of predictive analytics and, like IBM, it achieved a perfect score many times over. It is interesting to note that it scored highest among all vendors in Analysis. It was weighed down, however, by its strategy in areas like Go-to-Market Growth Rate and Acquisition and Pricing. This may not be as big a problem by next year, at least if Gartner was right in its most recent MQ on BI and Analytics Platforms, where it noted that SAS was aware of the drawback and was addressing the issue.

“SAP’s relentless investment in analytics pays off,” Forrester notes in its report. And as we’ve reiterated many times, the vendor’s predictive offerings include some snazzy differentiating features: analytics tools that you don’t have to be a data scientist to use, a visual tool that lets users analyze several databases at once, and, for SAP HANA customers, SAP’s Predictive Analytics Library (PAL) to analyze big data.

The Strong Performers

Not only does RapidMiner’s predictive analytics platform include more than 1,500 methods across all stages of the predictive analytics life cycle, but with a single click they can also be integrated into the cloud. There’s also a nifty “wisdom of the crowds” feature that Forrester singles out; it helps users sidestep mistakes made by others in the past and get to insights more quickly. What’s the downside? Implementation support and security.

Alteryx takes the pain out of data prep, which is often the hardest and most miserable part of a data worker’s job. It also offers a visual tool that helps data scientists collaborate with business users. Add to that an analytical apps gallery that helps users share their data prep and modeling workflows, and you’ve set a company up with what it needs to bring forth actionable insights. While Alteryx shines in areas like Data, Ability to Execute, and Go-to-Market Growth Rate, there’s room for improvement in Architecture and Security.

Oracle rates as a strong performer, even though it doesn’t offer a single-purpose solution. Instead, its Oracle SQL Developer tool includes a visual interface that lets data analysts create analytical workflows and models, according to Forrester. Not only that, but Oracle also takes advantage of open-source R for analysis, and it has revised a number of its algorithms to take advantage of Oracle’s database architecture and Hadoop.

FICO (yes, the credit scoring people) has taken its years of experience in actionable predictive analytics, built a solution and taken it to the cloud, where its use is frictionless and available to others. It could be a gem for data scientists who are continuously building and deploying models. FICO’s market offering has lots of room for improvement in areas like Data and Business Applications, though.

Angoss aims to make it easier for non-data scientists to get busy with predictive analytics tools via support services and intuitive interfaces for developing predictive models. While the solution provider focused its go-to-market offerings on decision trees until recently, it now also offers a Strategy Tree capability, which helps advanced users create complex cohorts from trees.

Alpine Data Labs offers “the most comprehensive collaboration tools of all the vendors in the Forrester Wave, and still manages to make the interface simple and familiar to users of any mainstream social media site,” wrote Gualtieri and Curran in the report. The problem seems to be that not enough people buy Alpine products. It might be a matter of acquisition and pricing options; it’s here that Alpine scores lowest among all vendors in the Wave.

Dell plans to go big in the big data and predictive analytics game. It bought its way into the market when it acquired Statistica. “Statistica has a comprehensive library of analysis algorithms and modeling tools and a significant installed base,” say the authors. Dell scores second lowest among Wave vendors in architecture, so it has a lot of room for improvement there.

KNIME is the open source contender in Forrester’s Wave. And though “free” isn’t the selling point of open source, it counts, perhaps second only to the passion of its developers. “KNIME’s flexible platform is supported by a community of thousands of developers who drive the continued evolution of the platform by contributing extensions essential to the marketplace: such as prebuilt industry APIs, geospatial mapping, and decision tree ensembles,” write the researchers. KNIME competes only with Microsoft for a low score on business applications and is in last place, by itself, when it comes to architecture. It has a perfect score when it comes to data.

Make Room for Contenders

Both Microsoft and Predixion Software bring something to the market that others do not.

They seem to be buds waiting to blossom. Microsoft, for its part, has its new Azure Machine Learning offering as well as the assets of Revolution Analytics, which it recently acquired. Not only that, but the company’s market reach and deep pockets cannot be overstated. While Microsoft brought home lower scores than many of the vendors evaluated in this Wave, that is somewhat forgivable because its big data predictive analytics solutions may be the youngest.

Predixion Software, according to Forrester, offers a unique tool, namely MSLM, a machine learning semantic model that packages up transformations, analysis and scoring of data and can be deployed in .NET or Java OSGi containers. “This means that users can embed entire predictive workflows in applications,” says the report.

Plenty of Good Choices

The key takeaways from Forrester’s research indicate that more classes of users can now have access to “modern predictive power” and that predictive analytics now allow organizations to embed intelligence and insight.

The analysts, of course, suggest that you download their report, which, in fact, might be worthwhile doing. This is a rapidly evolving market and vendors are upgrading their products at a rapid clip. We know this because there’s rarely a week where a new product announcement or feature does not cross our desks.

And if it’s true that the organizations who best leverage data will win the future, then working with the right tools might be an important differentiator.

Originally posted via “3 Vendors Lead the Wave for Big Data Predictive Analytics”

 

Source: 3 Vendors Lead the Wave for Big Data Predictive Analytics

Jul 13, 17: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

[Image: Accuracy check (Source)]

[ NEWS BYTES]

>> Using Emojis to Boost Sentiment Analysis – Datanami (under Sentiment Analysis)

>> Software-defined secure networking is ideal for hybrid cloud security – CyberScoop (under Hybrid Cloud)

>> Why Big Data Wasn’t Trump’s Achilles Heel After All – Forbes (under Big Data Analytics)

More NEWS? Click Here

[ FEATURED COURSE]

Introduction to Apache Spark


Learn the fundamentals and architecture of Apache Spark, the leading cluster-computing framework among professionals…. more

[ FEATURED READ]

Introduction to Graph Theory (Dover Books on Mathematics)


A stimulating excursion into pure mathematics aimed at “the mathematically traumatized,” but great fun for mathematical hobbyists and serious mathematicians as well. Requiring only high school algebra as mathematical bac… more

[ TIPS & TRICKS OF THE WEEK]

A strong business case could save your project
Like anything in corporate culture, a project is oftentimes about the business, not the technology. The same thinking applies to data analysis: it’s not always about the technicality but about the business implications. Data science project success criteria should therefore include project management success criteria as well. This will ensure smooth adoption, easy buy-ins, room for wins and cooperating stakeholders. So a good data scientist should also possess some qualities of a good project manager.

[ DATA SCIENCE Q&A]

Q: What is latent semantic indexing? What is it used for? What are the specific limitations of the method?
A: * An indexing and retrieval method that uses singular value decomposition (SVD) to identify patterns in the relationships between the terms and concepts contained in an unstructured collection of text
* Based on the principle that words used in the same contexts tend to have similar meanings
* “Latent”: semantic associations between words are present not explicitly but only latently
* For example: two synonyms may never occur in the same passage but should nonetheless have highly associated representations (a minimal sketch in code follows this Q&A)

Used for:

* Learning correct word meanings
* Subject matter comprehension
* Information retrieval
* Sentiment analysis (social network analysis)

Source
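
For a concrete feel, here is a minimal sketch of an LSI-style analysis using scikit-learn’s TF-IDF and truncated SVD on a tiny hypothetical corpus; real applications use far larger corpora and many more components.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "the car engine needs repair",
    "the automobile engine was repaired at the garage",
    "stock markets fell sharply today",
    "investors sold shares as markets dropped",
]

# Term-document matrix, then project documents onto 2 latent "concepts"
# via SVD; the reduced space is where the semantics become "latent".
X = TfidfVectorizer().fit_transform(docs)
doc_topics = TruncatedSVD(n_components=2, random_state=0).fit_transform(X)

# The two car-repair documents score as similar to each other, and the two
# finance documents cluster together, despite limited word overlap.
print(cosine_similarity(doc_topics))
```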

[ VIDEO OF THE WEEK]

@AnalyticsWeek: Big Data at Work: Paul Sonderegger

Subscribe to YouTube

[ QUOTE OF THE WEEK]

You can use all the quantitative data you can get, but you still have to distrust it and use your own intelligence and judgment. – Alvin Toffler

[ PODCAST OF THE WEEK]

#FutureOfData Podcast: Peter Morgan, CEO, Deep Learning Partnership

Subscribe: iTunes | GooglePlay

[ FACT OF THE WEEK]

Facebook users send on average 31.25 million messages and view 2.77 million videos every minute.

Sourced from: Analytics.CLUB #WEB Newsletter