Nov 14, 19: #AnalyticsClub #Newsletter (Events, Tips, News & more..)


[  COVER OF THE WEEK ]

image
Data Accuracy  Source

[ AnalyticsWeek BYTES]

>> Solving Common Data Challenges by analyticsweek

>> Sisense AI – What it Really Takes to Build a Better Mousetrap by analyticsweek

>> Three Upcoming Talks on Big Data and Customer Experience Management by bobehayes

Wanna write? Click Here

[ FEATURED COURSE]

R, ggplot, and Simple Linear Regression

image

Begin to use R and ggplot while learning the basics of linear regression… more

[ FEATURED READ]

The Signal and the Noise: Why So Many Predictions Fail–but Some Don’t

image

People love statistics. Statistics, however, do not always love them back. The Signal and the Noise, Nate Silver’s brilliant and elegant tour of the modern science-slash-art of forecasting, shows what happens when Big Da… more

[ TIPS & TRICKS OF THE WEEK]

Strong business case could save your project
Like anything in corporate culture, a project is often about the business, not the technology, and the same thinking applies to data analysis. It’s not always about the technicalities but about the business implications. Data science project success criteria should therefore include project-management success criteria as well, ensuring smooth adoption, easy buy-in, room for wins, and cooperative stakeholders. A good data scientist should also possess some of the qualities of a good project manager.

[ DATA SCIENCE Q&A]

Q: How frequently must an algorithm be updated?
A: You want to update an algorithm when:
– You want the model to evolve as data streams through infrastructure
– The underlying data source is changing
– Example: a retail store model that remains accurate as the business grows
– Dealing with non-stationarity

Some options:
– Incremental algorithms: the model is updated every time it sees a new training example
Note: simple, and you always have an up-to-date model, but you can’t weight examples differently.
Sometimes mandatory: when data must be discarded once seen (privacy)
– Periodic re-training in “batch” mode: simply buffer the relevant data and update the model every so often
Note: more decisions and a more complex implementation

How frequently?
– Is the added complexity worth it?
– Data horizon: how quickly do you need the most recent training example to be part of your model?
– Data obsolescence: how long does it take before data becomes irrelevant to the model? Are some older instances more relevant than newer ones?
Economics: generally, newer instances are more relevant than older ones. However, because of seasonality, data from the same month or quarter of the previous year can be more relevant than more recent periods of the current year. During a recession, data from previous recessions can be more relevant than newer data from a different economic cycle.
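The incremental option above can be sketched in a few lines. This is a toy illustration, not a production system: an online mean estimator that updates with every new example, with an optional exponential decay that down-weights old data to cope with obsolescence and non-stationarity.

```python
class OnlineMean:
    """Incrementally updated estimate of a stream's mean.

    With decay=None every example counts equally (plain running mean);
    with 0 < decay < 1 newer examples dominate, which is one way to
    handle data obsolescence and non-stationarity.
    """

    def __init__(self, decay=None):
        self.decay = decay
        self.mean = 0.0
        self.count = 0

    def update(self, x):
        self.count += 1
        if self.decay is None or self.count == 1:
            # Plain incremental mean: m_n = m_{n-1} + (x - m_{n-1}) / n
            self.mean += (x - self.mean) / self.count
        else:
            # Exponentially weighted mean: fixed step size (1 - decay)
            self.mean += (1.0 - self.decay) * (x - self.mean)
        return self.mean


m = OnlineMean()
for x in [1.0, 2.0, 3.0, 4.0]:
    m.update(x)
print(m.mean)  # 2.5
```

A periodic “batch” re-train would instead buffer the stream and refit from scratch every so often, trading freshness for flexibility.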

Source

[ VIDEO OF THE WEEK]

#BigData #BigOpportunity in Big #HR by @MarcRind #JobsOfFuture #Podcast

Subscribe to YouTube

[ QUOTE OF THE WEEK]

Data really powers everything that we do. – Jeff Weiner

[ PODCAST OF THE WEEK]

#BigData @AnalyticsWeek #FutureOfData #Podcast with @Beena_Ammanath, @GE

Subscribe 

iTunes  GooglePlay

[ FACT OF THE WEEK]

According to executives, the influx of data is putting a strain on IT infrastructure: 55 percent of respondents reported a slowdown of IT systems and 47 percent cited data security problems, according to a global survey from Avanade.

Sourced from: Analytics.CLUB #WEB Newsletter

Nov 07, 19: #AnalyticsClub #Newsletter (Events, Tips, News & more..)


[  COVER OF THE WEEK ]

image
Fake data  Source

[ AnalyticsWeek BYTES]

>> Does the Future Lie with Embedded BI? by analyticsweek

>> How can you reap the advantages of Big Data in your enterprise? Services you can expect from a Remote DBA Expert by thomassujain

>> Voices in AI – Episode 91: A Conversation with Mazin Gilbert by analyticsweekpick

Wanna write? Click Here

[ FEATURED COURSE]

CS229 – Machine Learning

image

This course provides a broad introduction to machine learning and statistical pattern recognition. … more

[ FEATURED READ]

How to Create a Mind: The Secret of Human Thought Revealed

image

Ray Kurzweil is arguably today’s most influential—and often controversial—futurist. In How to Create a Mind, Kurzweil presents a provocative exploration of the most important project in human-machine civilization—reverse… more

[ TIPS & TRICKS OF THE WEEK]

Finding success in data science? Find a mentor
Yes, most of us don’t feel the need, but most of us really could use one. Because most data science professionals work in isolation, getting an unbiased perspective is not easy, and it is often unclear how a data science career will progress. A network of mentors addresses these issues: it gives data professionals an outside perspective and an unbiased ally. It’s extremely important for successful data science professionals to build a mentor network and use it throughout their careers.

[ DATA SCIENCE Q&A]

Q: What is a POC (proof of concept)?
A: * A realization of a method that demonstrates its feasibility
* In engineering, a rough prototype of a new idea is often constructed as a proof of concept

Source

[ VIDEO OF THE WEEK]

Solving #FutureOfWork with #Detonate mindset (by @steven_goldbach & @geofftuff) #JobsOfFuture #Podcast

Subscribe to YouTube

[ QUOTE OF THE WEEK]

I keep saying that the sexy job in the next 10 years will be statisticians. And I’m not kidding. – Hal Varian

[ PODCAST OF THE WEEK]

@ReshanRichards on creating a learning startup for preparing for #FutureOfWork #JobsOfFuture #Podcast

Subscribe 

iTunes  GooglePlay

[ FACT OF THE WEEK]

And one of my favourite facts: at the moment, less than 0.5% of all data is ever analysed and used. Just imagine the potential here.

Sourced from: Analytics.CLUB #WEB Newsletter

A Single Customer View : The Secret Weapon Everyone Must Use

Insurers have a lot of customer data spread across different systems peppered throughout their enterprise. Customer data typically lives in multiple systems such as CRM, Billing, and Policy Administration. This approach, however, suffers from multiple challenges:

 

  1. Duplicate data across multiple systems
  2. Multiple versions of the same data point
  3. No single source of truth
  4. No correlation between cause and action
  5. Completely underutilized customer interaction data

 

Imagine a structure so flexible and scalable that it could bring all your data sources together, irrespective of data format, tie in with customer key result areas (KRAs), and at the same time deliver predictive insights at the point of decision. In real time.

 

Enter single customer view. Or as we call it – Customer OneView.

 

CRUX OneView

OneView in Action

 

Customer OneView is part of the Aureus data analytics platform, CRUX. OneView is nothing like a CRM: while a CRM shows only static information, OneView delivers intelligent, usable, real-time insights that can be put to use immediately.

OneView can integrate with (at least) four broad event data streams:

  1. Customer
  2. Relationship
  3. Transactions
  4. Interaction

 

Nitin wrote about stream-based data integration in his insightful post titled “Cheers to Stream Based Integration.”

 

These data streams can originate across multiple data systems: Policy Admin, CRM, Billing, and so on. Between them, these four cover some of the most critical customer data, which often lies underutilized. OneView not only brings these data streams together; it also helps build a comprehensive customer life journey showing important milestones, critical customer interactions, and sentiment at the level of each interaction or transaction as well as at the relationship level. While OneView is a powerful insight-delivery framework, it also delivers the output of predictive analytics models in a form that business users can act on. Imagine a customer sales representative talking to a customer, or a field sales agent going to meet one: OneView gives them unambiguous insight into the customer’s history, sentiment, and even the potential action to take, without burdening them with the hows and whys.
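The core mechanic of a single customer view, merging per-system event streams into one chronological journey, can be sketched in a few lines. The record layout and field names below are invented for illustration; they are not the actual CRUX/OneView schema.

```python
from datetime import date

# Hypothetical event records from three separate source systems.
crm_events = [
    {"customer_id": 7, "ts": date(2019, 3, 1), "type": "interaction", "detail": "call: billing query"},
]
billing_events = [
    {"customer_id": 7, "ts": date(2019, 2, 15), "type": "transaction", "detail": "premium paid"},
]
policy_events = [
    {"customer_id": 7, "ts": date(2019, 1, 10), "type": "relationship", "detail": "policy issued"},
]


def single_customer_view(customer_id, *streams):
    """Merge event streams from different systems into one
    chronological timeline for a single customer."""
    timeline = [e for s in streams for e in s if e["customer_id"] == customer_id]
    return sorted(timeline, key=lambda e: e["ts"])


for e in single_customer_view(7, crm_events, billing_events, policy_events):
    print(e["ts"], e["type"], e["detail"])
```

A real implementation would of course add identity resolution across systems, sentiment scoring, and streaming ingestion rather than in-memory lists.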

 

Imagine a typical cross-sell scenario. Most organizations tend to throw (figuratively speaking) the entire product catalog at the customer without any consideration for life-stage needs, portfolio, demographics, and so on. Not only is this a highly ineffective cross-sell approach, it makes for a terrible customer experience. With OneView, the customer service representative or the field agent knows exactly what the customer’s latest and overall sentiment is, what her product portfolio looks like, and which product she is most likely to buy next.

 

The end goal of any activity is to make the end customer’s experience epic. By knowing how a customer is likely to behave, modeled on her previous behavior, insurance companies can ensure that the customer experience keeps moving in the right direction.

 

OneView

Source: A Single Customer View : The Secret Weapon Everyone Must Use by analyticsweek

The Mainstream Adoption of Blockchain: Internal and External Enterprise Applications

The surging interest in blockchain initially pertained to its utility as the underlying architecture for the cryptocurrency phenomenon. Nonetheless, its core attributes (its distributed ledger system, immutability, and requisite permissions) are rapidly gaining credence in an assortment of verticals for numerous deployments.

Blockchain techniques are routinely used in several facets of supply chain management, insurance, and finance. In order to realize the widespread adoption rates many believe this technology is capable of, however, blockchain must enhance the very means of managing data-driven processes, similar to how applications of Artificial Intelligence are attempting to do so.

Today, there are myriad options for the enterprise to improve operations by embedding blockchain into fundamental aspects of data management. If properly architected, this technology can substantially impact facets of Master Data Management, data governance, and security. Additionally, it can provide these advantages not only between organizations, but also within them, operating as what Franz CEO Jans Aasman termed “a usability layer on top” of any number of IT systems.

Customer Domain Management
A particularly persuasive use case for the horizontal adoption of blockchain is deploying it to improve customer relations. Because blockchain essentially functions as a distributed database in which transactions between parties must be validated for approval (via a consensus approach bereft of centralized authority), it’s ideal for preserving the integrity of interactions between the enterprise and valued customers. In this respect it can “create trusted ledgers for customers that are completely invisible to the end user,” Aasman stated. An estimable example of this use case involves P2P networks, in which “people just use peer-to-peer databases that record transactions,” Aasman mentioned. “But these peer-to-peer transactions are checked by the blockchain to make sure people aren’t cheating.” Blockchain is used to manage transactions between parties in supply chains in much the same way. Blockchain aids organizations with this P2P customer use case because without it, “it’s very, very complicated for normal people to get it done,” Aasman said about traditional approaches to inter-organization ledger systems. With each party operating on a single blockchain, however, transactions become indisputable once they are sanctioned between the participants.
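The tamper-evidence described above can be illustrated with a toy hash-chained ledger. This is a deliberate simplification (no consensus, no distribution, no mining), but it shows the core property: each block stores the hash of its predecessor, so altering any past transaction breaks the chain and is immediately detectable.

```python
import hashlib
import json


def block_hash(block):
    # Deterministic hash of a block's full contents
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()


def append_block(chain, transaction):
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "tx": transaction})


def verify(chain):
    """True iff every block still points at the correct hash of its predecessor."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )


ledger = []
append_block(ledger, {"from": "A", "to": "B", "amount": 10})
append_block(ledger, {"from": "B", "to": "C", "amount": 4})
print(verify(ledger))           # True
ledger[0]["tx"]["amount"] = 99  # tamper with history
print(verify(ledger))           # False
```

In a real deployment the “invisible” part is exactly this: end users just send and receive, while verification like the above runs in the background.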

Internal Governance and Security
Perhaps the most distinguishable feature of the foregoing use case is the fact that in most instances, end users won’t even know they’re working with blockchain. What Aasman called an “invisible” characteristic of the blockchain ledger system is ideal for internal use to monitor employees in accordance with data governance and security procedures. Although blockchain supports internal intelligence or compliance for security and governance purposes, it’s most applicable to external transactions between organizations. In finance—just like in supply chain or in certain insurance transactions—“you could have multiple institutions that do financial transactions between each other, and each of them will have a version of that database,” Aasman explained. Those using these databases won’t necessarily realize they’re fortified by blockchain, and will simply use them as they would any other transactional system. In this case, “an accountant, a bookkeeper or a person that pays the bills won’t even know there’s a blockchain,” commented Aasman. “He will just send money or receive money, but in the background there’s blockchain making sure that no one can fool with the transactions.”

Master Data Management
Despite the fact that individual end users may be ignorant of the deployment of blockchain in the above use cases, it’s necessary for underlying IT systems to be fully aware of which clusters are part of this ledger system. According to Aasman, users will remain unaware of blockchain’s involvement “unless, of course, someone was trying to steal money, or trying to delete intermediate transactions, or deny that he sent money, or sent the same money twice. Then the system will say hey, user X has engaged in a ‘confusing’ activity.” In doing so, the system will help preserve adherence to company policies related to security or data governance issues.

Since organizations will likely employ other IT systems without blockchain, Master Data Management hubs will be important for “deciding for which transactions this applies,” Aasman said. “It’s going to be a feature of MDM.” Mastering the data from blockchain transactions with centralized MDM approaches can help align this data with others vital to a particular business domain, such as customer interactions. Aasman revealed that “the people that make master data management have to specify for which table this actually is true. Not the end users: the architects, the database people, the DBAs.” Implementing the MDM schema for which to optimize such internal applications of blockchain alongside those for additional databases and sources can quickly become complex with traditional methods, and may be simplified via smart data approaches.

Overall Value
The rapidity of blockchain’s rise will ultimately be determined by the utility the enterprise can derive from its technologies, as opposed to simply limiting its value to financial services and cryptocurrency. There are just as many telling examples of applying blockchain’s immutability to various facets of government and healthcare, or leveraging smart contracts to simplify interactions between business parties. By using this technology to better customer relations, reinforce data governance and security, and assist specific domains of MDM, organizations get a plethora of benefits from incorporating blockchain into their daily operations. The business value reaped in each of these areas could contribute to the overall adoption of this technology in both professional and private spheres of life. Moreover, it could help normalize blockchain as a commonplace technology for the contemporary enterprise.

Source: The Mainstream Adoption of Blockchain: Internal and External Enterprise Applications

Oct 31, 19: #AnalyticsClub #Newsletter (Events, Tips, News & more..)


[  COVER OF THE WEEK ]

image
Data shortage  Source

[ AnalyticsWeek BYTES]

>> Serverless: A Game Changer for Data Integration by analyticsweekpick

>> Why Organizations Are Choosing Talend vs Informatica by analyticsweekpick

>> Building Big Analytics as a Sustainable Competitive Advantage by v1shal

Wanna write? Click Here

[ FEATURED COURSE]

A Course in Machine Learning

image

Machine learning is the study of algorithms that learn from data and experience. It is applied in a vast variety of application areas, from medicine to advertising, from military to pedestrian. Any area in which you need… more

[ FEATURED READ]

Data Science from Scratch: First Principles with Python

image

Data science libraries, frameworks, modules, and toolkits are great for doing data science, but they’re also a good way to dive into the discipline without actually understanding data science. In this book, you’ll learn … more

[ TIPS & TRICKS OF THE WEEK]

Analytics Strategy that is Startup Compliant
With the right tools, capturing data is easy, but not being able to handle it can lead to chaos. One of the most reliable startup strategies for adopting data analytics is TUM, or The Ultimate Metric: the metric that matters most to your startup. Some advantages of TUM: it answers the most important business question, it cleans up your goals, it inspires innovation, and it helps you understand the entire quantified business.

[ DATA SCIENCE Q&A]

Q: How do you test whether a new credit risk scoring model works?
A: * Test on a holdout set
* Kolmogorov-Smirnov test

Kolmogorov-Smirnov test:
– Non-parametric test
– Compares a sample with a reference probability distribution, or compares two samples
– Quantifies the distance between the empirical distribution function of the sample and the cumulative distribution function of the reference distribution, or between the empirical distribution functions of two samples
– Null hypothesis (two-sample test): the samples are drawn from the same distribution
– Can be modified into a goodness-of-fit test
– In our case: cumulative percentages of goods vs. cumulative percentages of bads
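The two-sample KS statistic is simply the maximum gap between the two empirical CDFs, here is a minimal pure-Python sketch (the score values are made up for illustration; in practice you would use scipy.stats.ks_2samp):

```python
import bisect


def ks_statistic(sample_a, sample_b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum absolute
    difference between the two empirical distribution functions."""
    a, b = sorted(sample_a), sorted(sample_b)

    def ecdf(sorted_sample, x):
        # Fraction of the sample that is <= x
        return bisect.bisect_right(sorted_sample, x) / len(sorted_sample)

    return max(abs(ecdf(a, x) - ecdf(b, x)) for x in sorted(set(a) | set(b)))


# Model scores for known-good vs known-bad accounts (made-up numbers):
good = [0.7, 0.8, 0.85, 0.9, 0.95]
bad = [0.2, 0.3, 0.4, 0.6, 0.75]
print(ks_statistic(good, bad))  # large value = the model separates the groups well
```

For credit scoring, a larger KS between the score distributions of goods and bads means the model discriminates better between the two populations.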

Source

[ VIDEO OF THE WEEK]

#BigData @AnalyticsWeek #FutureOfData #Podcast with @ScottZoldi, @FICO

Subscribe to YouTube

[ QUOTE OF THE WEEK]

Data are becoming the new raw material of business. – Craig Mundie

[ PODCAST OF THE WEEK]

@CRGutowski from @GE_Digital on Using #Analytics to #Transform Sales #FutureOfData #Podcast

Subscribe 

iTunes  GooglePlay

[ FACT OF THE WEEK]

Within five years there will be over 50 billion smart connected devices in the world, all developed to collect, analyze and share data.

Sourced from: Analytics.CLUB #WEB Newsletter

Cloud Migrations: Big Challenges, Big Opportunities

When your organization decides to pull the trigger on a cloud migration, a lot of stuff will start happening all at once. Regardless of how long the planning process has been, once data starts being relocated, a variety of competing factors that had all been theoretical become devastatingly real: frontline business users still want to be able to run analyses while the migration is happening, your data engineers are concerned with the switch from whatever database you were using before, and the development org has its own data needs. With a comprehensive, BI-focused data strategy, you and your stakeholders will know what your ideal data model should look like once all your data is moved over. That way, as you’re managing the process and trying to keep everyone happy, you’ll end up in a stronger place when your migration is over than you were at the start. And isn’t that the goal?

BI-Focus and Your Data Infrastructure

“What does all this have to do with my data model?” you might be wondering. “And for that matter, my BI solution?”

I’m glad you asked, internet stranger. The answer is everything. Your data infrastructure underpins your data model and powers all of your business-critical IT systems. The form it takes can have immense ramifications for your organization, your product, and the new things you want to do with it (and how you want to build and expand on it and your feature offerings). Your data infrastructure is hooked into your BI solution via connectors, so it’ll work no matter where the data is stored. Picking the right data model, once all your data is in its new home, is the final piece that will allow you to get the most out of it with your BI solution. If you don’t have a BI solution, the perfect time to implement is once all your data is moved over and your model is built. This should all be part of your organization’s holistic cloud strategy, with buy-in from major partners who are handling the migration.

Cloud Migration

Picking the Right Database Model for You

So you’re giving your data a new home, and maybe implementing a BI solution when it’s all done. Now, which database model is right for your company and your use case? There is a wide array of ways to organize data, depending on what you want to do with it.

One of the broadest is a conceptual model, which focuses on representing the objects that matter most to the business and the relationships between them (versus being a model of the data about those objects). This database model is designed principally for business users. Compare this to a physical model, which is all about the structure of the data. In this model, you’ll be dealing with tables, columns, relationships, and foreign keys, which define the connections between tables.

Now, let’s say you’re only focused on representing your data organization and architecture graphically, putting aside the physical usage or database management framework. In cases like these, a logical model could be the way to go. Examples of these types of databases include relational (dealing with data as tables or relations), network (putting data in the form of records), and hierarchical (which is a progressive tree-type structure, with each branch of the tree showing related records). These models all feature a high degree of standardization and cover all entities in the dataset and the relationships between them.

Got a wide array of different objects and types of data to deal with? Consider an object-oriented database model, sometimes called a “hybrid model.” These models treat their contained data as a collection of reusable software objects, all with related features. They also consolidate tables but aren’t limited to them, giving you freedom when dealing with lots of varied data. You can use this kind of model for multimedia items that won’t fit in a relational database, or to create a hypertext database that connects to other objects and sorts out divergent information.

Lastly, we can’t help but mention the star schema here, which has elements arranged around a central core and looks like an asterisk. This model is great for querying informational indexes as part of a larger data pool. It’s used to dig up insights for business users, OLAP cubes, analytics apps, and ad-hoc analyses. It’s a simple, yet powerful, structure that sees a lot of usage, despite its simplicity.
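As a concrete sketch of the star schema, here is a minimal example using the stdlib sqlite3 module (the table and column names are invented for the example): a central fact table of sales surrounded by dimension tables, with the typical ad-hoc query joining outward from the core.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables radiate around the central fact table.
cur.execute("CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT)")
cur.execute("""CREATE TABLE fact_sales (
    sale_id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES dim_customer(customer_id),
    product_id INTEGER REFERENCES dim_product(product_id),
    amount REAL)""")

cur.execute("INSERT INTO dim_customer VALUES (1, 'Acme Corp')")
cur.execute("INSERT INTO dim_product VALUES (10, 'Analytics')")
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(100, 1, 10, 250.0), (101, 1, 10, 150.0)])

# A typical ad-hoc query: join out from the fact table to the dimensions.
cur.execute("""SELECT c.name, p.category, SUM(f.amount)
               FROM fact_sales f
               JOIN dim_customer c ON f.customer_id = c.customer_id
               JOIN dim_product p ON f.product_id = p.product_id
               GROUP BY c.name, p.category""")
print(cur.fetchall())  # [('Acme Corp', 'Analytics', 400.0)]
```

The same shape scales up: OLAP cubes and BI dashboards are, in essence, issuing variations of that join-and-aggregate query against a much larger fact table.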

Now What?

Whether you’re building awesome analytics into your app or empowering in-house users to get more out of your data, knowing what you’re doing with your data is key to maintaining the right models. Once you’ve picked your database, it’s time to pick your data model, with an eye towards what you want to do with it once it’s hooked into your BI solution.

Worried about losing customers? (Who isn’t?) A predictive churn model can help you get ahead of the curve by putting time and attention into relationships that are at risk of going sour. On the other side of the coin, predictive up- and cross-sell models can show you where you can get more money out of a customer and which ones are ripe to deepen your financial relationship.
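As a toy illustration of the churn idea (not a production model, which would be fitted on historical data), a churn score can be sketched from recency and frequency features with hand-picked, purely hypothetical weights:

```python
from datetime import date


def churn_risk(last_purchase, purchases_per_year, today=date(2019, 10, 31)):
    """Toy churn score in [0, 1]: risk grows with days since the last
    purchase and shrinks with purchase frequency. The weights are
    illustrative, not fitted to any data."""
    days_inactive = (today - last_purchase).days
    recency_risk = min(days_inactive / 365.0, 1.0)        # 0 = bought today, 1 = a year or more
    frequency_cushion = min(purchases_per_year / 12.0, 1.0)
    return round(recency_risk * (1.0 - 0.5 * frequency_cushion), 3)


print(churn_risk(date(2019, 10, 1), purchases_per_year=12))  # low risk: recent, frequent
print(churn_risk(date(2018, 9, 1), purchases_per_year=1))    # high risk: stale, rare
```

A real predictive churn model would replace the hand-picked weights with coefficients learned from labeled historical churn outcomes.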

What about your marketing efforts? A customer segmentation data model can help you understand the buying behaviors of your current customers and target groups, and which marketing plays are having the desired effect. Or go beyond marketing with “next-best-action” models that take into account life events, purchasing behaviors, social media, and anything else you can get your hands on, so you can figure out which next action with a given target (email, ads, phone call, etc.) will have the greatest impact. And predictive analyses aren’t just for human-centric activities: manufacturing and logistics companies can take advantage of maintenance models that let you circumvent machine breakdowns based on historical data. Don’t get caught without a vital piece of equipment again.

Bringing It All Together with BI

Staying focused on your long-term goals is an important key to success. Whether you’re building a game-changing product or rebuilding your data model, having a firmly defined goal makes all the difference to the success of your enterprise. If you’re already migrating your data to the cloud, then you’re at the perfect juncture to pick the right database and data models for your eventual use cases. Once these are set up, they’ll integrate seamlessly with your BI tool (and if you don’t have one yet, it’ll be the perfect time to implement one). Big moves like this represent big challenges, but also big opportunities to lay the foundation for whatever you’re planning on building. Then you just have to build it!

Cloud Migration

Source: Cloud Migrations: Big Challenges, Big Opportunities by analyticsweek

Oct 24, 19: #AnalyticsClub #Newsletter (Events, Tips, News & more..)


[  COVER OF THE WEEK ]

Fake data  Source

[ AnalyticsWeek BYTES]

>> The User Experience of Health Insurance Websites by analyticsweek

>> Top Reasons Why Banking & Financial Institutions Are Relying on Big Data Analytics by thomassujain

>> It’s Official! Talend to Welcome Stitch to the Family! by analyticsweekpick

Wanna write? Click Here

[ FEATURED COURSE]

Hadoop Starter Kit

Hadoop learning made easy and fun. Learn HDFS and MapReduce, with an introduction to Pig and Hive, plus FREE cluster access…. more

[ FEATURED READ]

Machine Learning With Random Forests And Decision Trees: A Visual Guide For Beginners

If you are looking for a book to help you understand how the machine learning algorithms “Random Forest” and “Decision Trees” work behind the scenes, then this is a good book for you. Those two algorithms are commonly u… more

[ TIPS & TRICKS OF THE WEEK]

Analytics Strategy that is Startup Compliant
With the right tools, capturing data is easy, but not being able to handle that data can lead to chaos. One of the most reliable startup strategies for adopting data analytics is TUM, or The Ultimate Metric: the single metric that matters most to your startup. Some advantages of TUM: it answers the most important business question, it cleans up your goals, it inspires innovation, and it helps you understand the entire quantified business.

[ DATA SCIENCE Q&A]

Q:What is the Central Limit Theorem? Explain it. Why is it important?
A: The CLT states that the sample mean of a sufficiently large number of independent, identically distributed random variables is approximately normally distributed, regardless of the underlying distribution; that is, the sampling distribution of the sample mean is approximately normal.
– Used in hypothesis testing
– Used for confidence intervals
– Requires random variables that are i.i.d.: independent and identically distributed
– Requires finite variance
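
A quick simulation makes the theorem concrete. The sketch below (plain Python, illustrative only) draws repeated samples from a uniform distribution, which is far from normal, and shows that the means of those samples cluster around 0.5 with roughly the spread the CLT predicts.

```python
import random
import statistics

random.seed(42)

def sample_means(n_samples=2000, n=50):
    """Draw n_samples batches of n uniform(0,1) variates and return the
    mean of each batch: the sampling distribution of the sample mean."""
    return [statistics.mean(random.random() for _ in range(n))
            for _ in range(n_samples)]

means = sample_means()
# Uniform(0,1) has mean 0.5 and variance 1/12, so the CLT predicts the
# sample mean is approximately Normal(0.5, sqrt((1/12)/50)) ~ N(0.5, 0.041).
print(round(statistics.mean(means), 2))
print(round(statistics.stdev(means), 3))
```

Plot a histogram of `means` and the familiar bell shape appears, even though the underlying uniform distribution is flat.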

Source

[ VIDEO OF THE WEEK]

Pascal Marmier (@pmarmier) @SwissRe discusses running data driven innovation catalyst

Subscribe to YouTube

[ QUOTE OF THE WEEK]

It’s easy to lie with statistics. It’s hard to tell the truth without statistics. – Andrejs Dunkels

[ PODCAST OF THE WEEK]

#BigData @AnalyticsWeek #FutureOfData #Podcast with @ScottZoldi, @FICO

Subscribe 

iTunes  GooglePlay

[ FACT OF THE WEEK]

Poor data across businesses and the government costs the U.S. economy $3.1 trillion a year.

Sourced from: Analytics.CLUB #WEB Newsletter

IBM Invests to Help Open-Source Big Data Software — and Itself

The IBM “endorsement effect” has often shaped the computer industry over the years. In 1981, when IBM entered the personal computer business, the company decisively pushed an upstart technology into the mainstream.

In 2000, the open-source operating system Linux was viewed askance in many corporations as an oddball creation and even legally risky to use, since the open-source ethos prefers sharing ideas rather than owning them. But IBM endorsed Linux and poured money and people into accelerating the adoption of the open-source operating system.

On Monday, IBM is to announce a broadly similar move in big data software. The company is placing a large investment — contributing software developers, technology and education programs — behind an open-source project for real-time data analysis, called Apache Spark.

The commitment, according to Robert Picciano, senior vice president for IBM’s data analytics business, will amount to “hundreds of millions of dollars” a year.

Photo courtesy of Pingdom via Flickr

In the big data software market, much of the attention and investment so far has been focused on Apache Hadoop and the companies distributing that open-source software, including Cloudera, Hortonworks and MapR. Hadoop, put simply, is the software that makes it possible to handle and analyze vast volumes of all kinds of data. The technology came out of the pure Internet companies like Google and Yahoo, and is increasingly being used by mainstream companies, which want to do similar big data analysis in their businesses.
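
The MapReduce model at the heart of Hadoop can be illustrated with a toy word count in plain Python. This is a sketch of the programming model only, not of Hadoop itself; the real framework distributes the map, shuffle and reduce phases across a cluster and handles fault tolerance along the way.

```python
from collections import defaultdict
from itertools import chain

docs = ["big data needs big tools", "spark speeds up big data"]

def map_phase(doc):
    # Map: emit a (key, value) pair for each word in the document.
    return [(word, 1) for word in doc.split()]

def shuffle(pairs):
    # Shuffle: group values by key, as the framework does between
    # the map and reduce phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: combine each key's values into a final count.
    return {key: sum(values) for key, values in groups.items()}

counts = reduce_phase(shuffle(chain.from_iterable(map(map_phase, docs))))
print(counts["big"])   # prints 3
print(counts["data"])  # prints 2
```

The appeal of the model is that each phase is embarrassingly parallel: maps run per document, reduces run per key, so the same logic scales from two strings to petabytes.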

But if Hadoop opens the door to probing vast volumes of data, Spark promises speed. Real-time processing is essential for many applications, from analyzing sensor data streaming from machines to sales transactions on online marketplaces. The Spark technology was developed at the Algorithms, Machines and People Lab at the University of California, Berkeley. A group from the Berkeley lab founded a company two years ago, Databricks, which offers Spark software as a cloud service.

Spark, Mr. Picciano said, is crucial technology that will make it possible to “really deliver on the promise of big data.” That promise, he said, is to quickly gain insights from data to save time and costs, and to spot opportunities in fields like sales and new product development.

IBM said it will put more than 3,500 of its developers and researchers to work on Spark-related projects. It will contribute machine-learning technology to the open-source project, and embed Spark in IBM’s data analysis and commerce software. IBM will also offer Spark as a service on its programming platform for cloud software development, Bluemix. The company will open a Spark technology center in San Francisco to pursue Spark-based innovations.

And IBM plans to partner with academic and private education organizations including UC Berkeley’s AMPLab, DataCamp, Galvanize and Big Data University to teach Spark to as many as 1 million data engineers and data scientists.

Ion Stoica, the chief executive of Databricks, who is a Berkeley computer scientist on leave from the university, called the IBM move “a great validation for Spark.” He had talked to IBM people in recent months and knew they planned to back Spark, but, he added, “the magnitude is impressive.”

With its Spark initiative, analysts said, IBM wants to lend a hand to an open-source project, woo developers and strengthen its position in the fast-evolving market for big data software.

By aligning itself with a popular open-source project, IBM, they said, hopes to attract more software engineers to use its big data software tools, too. “It’s first and foremost a play for the minds — and hearts — of developers,” said Dan Vesset, an analyst at IDC.

IBM is investing in its own future as much as it is contributing to Spark. IBM needs a technology ecosystem, where it is a player and has influence, even if it does not immediately profit from it. IBM mainly makes its living selling applications, often tailored to individual companies, which address challenges in their business like marketing, customer service, supply-chain management and developing new products and services.

“IBM makes its money higher up, building solutions for customers,” said Mike Gualtieri, an analyst for Forrester Research. “That’s ultimately why this makes sense for IBM.”

To read the original article on The New York Times, click here.

Source: IBM Invests to Help Open-Source Big Data Software — and Itself

Oct 17, 19: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

Convincing  Source

[ AnalyticsWeek BYTES]

>> Dec 06, 18: #AnalyticsClub #Newsletter (Events, Tips, News & more..) by admin

>> BI Implementation Insights: Clear and Easy Starting Points by analyticsweek

>> The Data Driven Road Less Traveled by d3eksha

Wanna write? Click Here

[ FEATURED COURSE]

CS109 Data Science

Learning from data in order to gain useful predictions and insights. This course introduces methods for five key facets of an investigation: data wrangling, cleaning, and sampling to get a suitable data set; data managem… more

[ FEATURED READ]

Antifragile: Things That Gain from Disorder

Antifragile is a standalone book in Nassim Nicholas Taleb’s landmark Incerto series, an investigation of opacity, luck, uncertainty, probability, human error, risk, and decision-making in a world we don’t understand. The… more

[ TIPS & TRICKS OF THE WEEK]

Keeping Biases Checked during the last mile of decision making
Today, a data-driven leader, data scientist, or data-driven expert is constantly put to the test by helping their team solve problems with their skills and expertise. Believe it or not, part of that decision tree is derived from intuition, which adds a bias to our judgment and taints the suggestions. Most skilled professionals understand and handle these biases well, but in a few cases we fall into tiny traps and can find ourselves stuck in biases that impair our judgment. So it is important to keep intuition bias in check when working on a data problem.

[ DATA SCIENCE Q&A]

Q:Give examples of bad and good visualizations?
A: Bad visualizations:
– Pie charts: difficult to make comparisons between items when area encodes the values, especially when there are many items
– Poor color choice for classes: heavy use of red, orange and blue. Readers may assume the colors mean good (blue) versus bad (orange and red) when they are merely associated with specific segments
– 3D charts: can distort perception and therefore skew the data
– Dashed and dotted lines in a line chart: can be distracting (prefer a solid line)

Good visualizations:
– Heat map with a single color: some colors stand out more than others, giving extra weight to that data; a single color in varying shades shows intensity better
– Adding a trend line (regression line) to a scatter plot helps the reader spot trends
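
On the trend-line point, the regression line itself is just an ordinary least-squares fit. A minimal sketch (plain Python, with made-up data) of the slope and intercept you would overlay on a scatter plot:

```python
def trend_line(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept: the
    regression line to overlay on a scatter plot of (xs, ys)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Made-up points that rise roughly two units of y per unit of x.
xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 7.8, 10.1]
a, b = trend_line(xs, ys)
print(round(a, 2))  # prints 1.99
```

Any plotting library can then draw `a * x + b` over the scatter; the line, not the individual points, is what communicates the trend to the reader.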

Source

[ VIDEO OF THE WEEK]

@ReshanRichards on creating a learning startup for preparing for #FutureOfWork #JobsOfFuture #Podcast

Subscribe to YouTube

[ QUOTE OF THE WEEK]

If you can’t explain it simply, you don’t understand it well enough. – Albert Einstein

[ PODCAST OF THE WEEK]

#DataScience Approach to Reducing #Employee #Attrition

Subscribe 

iTunes  GooglePlay

[ FACT OF THE WEEK]

Within five years there will be over 50 billion smart connected devices in the world, all developed to collect, analyze and share data.

Sourced from: Analytics.CLUB #WEB Newsletter

Creating Value from Analytics: The Nine Levers of Business Success

IBM just released the results of a global study on how businesses can get the most value from Big Data and analytics. They found nine areas that are critical to creating value from analytics. You can download the entire study here.

IBM Institute for Business Value surveyed 900 IT and business executives from 70 countries from June through August 2013. The 50+ survey questions were designed to help translate concepts relating to generating value from analytics into actions.

Nine Levers to Value Creation

Figure 1. Nine Levers to Value Creation from Analytics.

The researchers identified nine levers that help organizations create value from data. They compared leaders (those who identified their organization as substantially outperforming their industry peers) with the rest of the sample. They found that the leaders (19% of the sample) implement the nine levers to a greater degree than the non-leaders. These nine levers are:

  1. Source of value: Actions and decisions that generate results. Leaders tend to focus primarily on their ability to increase revenue and less so on cost reduction.
  2. Measurement: Evaluating the impact on business outcomes. Leaders ensure they know how their analytics impact business outcomes.
  3. Platform: Integrated capabilities delivered by hardware and software. Sixty percent of Leaders have predictive analytic capabilities, as well as simulation (55%) and optimization (67%) capabilities.
  4. Culture: Availability and use of data and analytics within an organization. Leaders make more than half of their decisions based on data and analytics.
  5. Data: Structure and formality of the organization’s data governance process and the security of its data. Two-thirds of Leaders trust the quality of their data and analytics. A majority of leaders (57%) adopt enterprise-level standards, policies and practices to integrate data across the organization.
  6. Trust: Organizational confidence. Leaders demonstrate a high degree of trust between individual employees (60% between executives, 53% between business and IT executives).
  7. Sponsorship: Executive support and involvement. Leaders (56%) oversee the use of data and analytics within their own departments, guided by an enterprise-level strategy, common policies and metrics, and standardized methodologies compared to the rest (20%).
  8. Funding: Financial rigor in the analytics funding process. Nearly two-thirds of Leaders pool resources to fund analytic investments. They evaluate these investments through pilot testing, cost/benefit analysis and forecasting KPIs.
  9. Expertise: Development of and access to data management and analytic skills and capabilities. Leaders share advanced analytics subject matter experts across projects, where analytics employees have formalized roles, clearly defined career paths and experience investments to develop their skills.

The researchers state that each of the nine levers has a different impact on the organization’s ability to deliver value from data and analytics; that is, all nine levers distinguish Leaders from the rest, but each lever impacts value creation in different ways. The Enable levers need to be in place before value can be seen through the Drive and Amplify levers. The nine levers are organized into three levels:

  1. Enable: These levers form the basis for big data and analytics.
  2. Drive: These levers are needed to realize value from data and analytics; lack of sophistication within these levers will impede value creation.
  3. Amplify: These levers boost value creation.

Recommendations: Creating an Analytic Blueprint

Figure 2. Analytics Blueprint for Creating Value from Data.

Next, the researchers offered a blueprint on how business leaders can translate the research findings into real changes for their own businesses. This operational blueprint consists of three areas: 1) Strategy, 2) Technology and 3) Organization.

1. Strategy

Strategy is about the deliberateness with which the organization approaches analytics. Businesses need to adopt practices around Sponsorship, Source of value and Funding to instill a sense of purpose to data and analytics that connects the strategic visions to the tactical activities.

2. Technology

Technology is about the enabling capabilities and resources an organization has available to manage, process, analyze, interpret and store data. Businesses need to adopt practices around Expertise, Data and Platform to create a foundation for analytic discovery to address today’s problems while planning for future data challenges.

3. Organization

Organization is about the actions taken to use data and analytics to create value. Businesses need to adopt practices around Culture, Measurement and Trust to enable the organization to be driven by fact-based decisions.

Summary

One way businesses are trying to outperform their competitors is through the use of analytics on their treasure trove of data. The IBM researchers were able to identify the necessary ingredients to extract value from analytics. The current research supports prior research on the benefits of analytics in business:

  1. Top-performing businesses are twice as likely to use analytics to guide future strategies and guide day-to-day operations compared to their low-performing counterparts.
  2. Analytic innovators 1) use analytics primarily to increase value to the customer rather than to decrease costs/allocate resources, 2) aggregate/integrate different business data silos and look for relationships among once-disparate metrics, and 3) secure executive support around the use of analytics that encourages sharing of best practices and data-driven insights throughout their company.

To extract value from analytics, businesses need to focus on improving the strategic, technological and organizational aspects of how they treat data and analytics. The research identified nine areas, or levers, that executives can use to improve the value they generate from their data.

For the interested reader, I recently provided a case study (see: The Total Customer Experience: How Oracle Builds their Business Around the Customer) that illustrates how one company uses analytical best practices to help improve the customer experience and increase customer loyalty.

————————–

TCE Total Customer Experience

 

Buy TCE: Total Customer Experience at Amazon >>

In TCE: Total Customer Experience, learn more about how you can integrate your business data around the customer and apply a customer-centric analytics approach to gain deeper customer insights.

 

Source