Sep 27, 18: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

Image: Complex data (Source)

[ LOCAL EVENTS & SESSIONS]

More WEB events? Click Here

[ AnalyticsWeek BYTES]

>> Practical Tips for Running a PURE Evaluation by analyticsweek

>> Big Data Explained in Less Than 2 Minutes – To Absolutely Anyone by analyticsweekpick

>> 20 Best Practices in Customer Feedback Programs: Building a Customer-Centric Company by bobehayes

Wanna write? Click Here

[ NEWS BYTES]

>> DENAVE launches DENSALES – an end-to-end sales force automation solution – Express Computer (press release) (blog) Under Sales Analytics

>> Hyperconverged Startup Cohesity Hits $200M Annual Sales Pace – Data Center Knowledge Under Data Center

>> Like Magic: Seamless Customer Care In An IoT-Connected World – Forbes Under Customer Experience

More NEWS? Click Here

[ FEATURED COURSE]

CS229 – Machine Learning


This course provides a broad introduction to machine learning and statistical pattern recognition. … more

[ FEATURED READ]

The Signal and the Noise: Why So Many Predictions Fail–but Some Don’t


People love statistics. Statistics, however, do not always love them back. The Signal and the Noise, Nate Silver’s brilliant and elegant tour of the modern science-slash-art of forecasting, shows what happens when Big Da… more

[ TIPS & TRICKS OF THE WEEK]

Winter is coming, warm your Analytics Club
Yes and yes! As we head into winter, what better time to talk about our increasing dependence on data analytics to help with our decision making. Data- and analytics-driven decision making is rapidly sneaking its way into our core corporate DNA, yet we are not churning out practice grounds to test those models fast enough. Such snug-looking models have hidden nails that could induce uncharted pain if they go unchecked. This is the right time to start thinking about putting an Analytics Club [Data Analytics CoE] in your workplace to lab out the best practices and provide a test environment for those models.

[ DATA SCIENCE Q&A]

Q: Compare R and Python
A: R
– Focuses on better, user-friendly data analysis, statistics and graphical models
– The closer you are to statistics, data science and research, the more you might prefer R
– Statistical models can be written with only a few lines in R
– The same piece of functionality can be written in several ways in R
– Mainly used for standalone computing or analysis on individual servers
– Large number of packages, for anything!

Python
– Used by programmers who want to delve into data science
– The closer you are working in an engineering environment, the more you might prefer Python
– Coding and debugging is easier mainly because of the nice syntax
– Any piece of functionality is always written the same way in Python
– When data analysis needs to be implemented with web apps
– Good tool to implement algorithms for production use
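
As a minimal illustration of the “few lines” point (a sketch assuming the pandas and statsmodels packages), here is a simple linear model in Python; the R equivalent is the one-liner lm(y ~ x, data = df):

    import pandas as pd
    import statsmodels.formula.api as smf

    # Toy data; in R: df <- data.frame(x = 1:5, y = c(2.1, 3.9, 6.2, 8.1, 9.8))
    df = pd.DataFrame({"x": [1, 2, 3, 4, 5], "y": [2.1, 3.9, 6.2, 8.1, 9.8]})

    # Ordinary least squares via an R-style formula; in R: lm(y ~ x, data = df)
    model = smf.ols("y ~ x", data=df).fit()
    print(model.params)  # fitted intercept and slope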

Source

[ VIDEO OF THE WEEK]

#BigData @AnalyticsWeek #FutureOfData #Podcast with Joe DeCosmo, @Enova

Subscribe to YouTube

[ QUOTE OF THE WEEK]

Data really powers everything that we do. – Jeff Weiner

[ PODCAST OF THE WEEK]

Discussing #InfoSec with @travturn, @hrbrmstr (@rapid7), @thebearconomist (@boozallen), @yaxa_io

Subscribe 

iTunes  GooglePlay

[ FACT OF THE WEEK]

More than 200bn HD movies – which would take a person 47m years to watch.

Sourced from: Analytics.CLUB #WEB Newsletter

Piwik PRO Introduces New Pricing Packages For Small and Medium Enterprises

We’re happy to announce that new pricing plans are available for Piwik PRO Marketing Suite. The changes will allow small and medium enterprises to take advantage of affordable privacy-compliant marketing tools (including Consent Manager) to meet the requirements of the GDPR.

In recent weeks, one of the most restrictive data privacy regulations the world has ever seen came into force – we’re obviously talking about GDPR.

Now every company that processes the personal data of EU citizens has to make sure that their internal processes, services and products are in line with the provisions of the new law (we wrote about it in numerous articles on our blog, be sure to check them out). Otherwise, they risk severe fines.

Among many other things, they have to collect active consents from visitors before they start processing their data.

The new rules apply not only to large corporations, but also to small and medium-sized enterprises.

When the market standard is not enough

Many of them worry because the most popular freemium analytics software provider has decided to limit its support in this matter to the bare minimum.

Although Google introduced some product updates that aim to help their clients comply with the new regulation (like data retention control and a user deletion tool), they decided that their clients (data controllers) are the ones who have to develop their own mechanism for collecting, managing, and storing consents (via opt-in) from visitors (for both Google Analytics and Google Tag Manager).

Following all these rules can be a hassle for website owners, especially small and medium enterprises with often-limited time and staff.

Important note! Recent events indicate that Google could be an unreliable partner in the face of the new EU regulations. On the first day after the regulation came into force, Google was sued for violating provisions of the GDPR by Max Schrems, an Austrian lawyer and privacy activist. You can read more about it in this article by The Verge.

How Piwik PRO can help you with the task

Luckily, there are many vendors who decided to create a tool to mediate between visitors and analytics software. Depending on the provider, it’s called Cookie Consent Manager, Cookie Widget, GDPR Consent Manager, etc.

These tools are a kind of gatekeeper that passes information about consents between individual visitors and your analytics system. That way, you make sure that the data you’re operating on has been collected in compliance with the new law.
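
As a minimal, hypothetical sketch of that gatekeeper idea (the names consents, track and send_to_analytics are illustrative, not Piwik PRO’s actual API), an event reaches the analytics system only when the visitor’s consent has been recorded:

    def send_to_analytics(visitor_id, event):
        # Stand-in for the real analytics call
        print(f"tracked {event!r} for {visitor_id}")

    consents = {"visitor_1": True}  # filled in by the consent banner (opt-in)

    def track(visitor_id, event):
        # The gatekeeper: no recorded consent means the event is dropped
        if consents.get(visitor_id, False):
            send_to_analytics(visitor_id, event)

    track("visitor_1", "pageview")  # forwarded
    track("visitor_2", "pageview")  # dropped: no consent on record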

One of the companies developing this type of product is Piwik PRO. You can read more about our solution here.

New pricing plan for small and medium enterprises

Due to the growing interest in our GDPR Consent Manager among small and medium enterprises, we decided to prepare a special offer tailored to their needs.

All companies wanting to collect data about the behavior of their website’s visitors in a privacy-compliant manner will be able to take advantage of the new “Business Plan” pricing package. The offer is intended for businesses with up to 2 million monthly actions on their websites.

It includes the following products:

The combined forces of these products will help you collect all the relevant information about visitors without violating the provisions of the new law (and also other data privacy laws including Chinese Internet Law and Russian law 526-FZ).

Additionally, your data will be stored in a highly secure environment:

  • ISO 27001 Certified private cloud data center
  • Fully-redundant infrastructure with 99% SLA
  • Microsoft Azure GDPR-compliant cloud infrastructure, hosted in the location of your choice: Germany, Netherlands, USA

What’s more, you can count on professional customer support, including:

  • Email support
  • Live chat
  • User training
  • Professional Onboarding

Sound interesting? Then give it a (free) spin! All you have to do is register for a 30-day free trial. Our sales representatives will contact you within 24 hours!

You can also read more about the offer on our pricing page.

REGISTER FOR A FREE TRIAL

The post Piwik PRO Introduces New Pricing Packages For Small and Medium Enterprises appeared first on Piwik PRO.

Source: Piwik PRO Introduces New Pricing Packages For Small and Medium Enterprises by analyticsweek

Big Data’s Big Deal is HUMAN TOUCH, not Technology

Image: Big Data Big Deal - Human Touch

THE MYTH:

I have been involved in marketing analytics work for some years now. It requires me to regularly talk to CXOs about their big data challenges and their plans to leverage this data to improve business decision making. I am constantly surprised by how many misconceptions exist among executives. They all read about new technologies and platforms coming out of Silicon Valley that magically clean, organize, analyze and visualize data for them. As if they just have to implement some technology, press a button, and insights will start flowing.

This is a myth. There is no such (magical) technology-based analysis. Period.

Big Data’s big deal is not about technology platforms – it is rather about appropriate human interface with data technology.

I am myself guilty of selling big data solutions under the facade of technology and platforms; in many ways, I have contributed to this misconception about big data technology. So I hope you believe me when I tell you: Big Data’s big deal is not about technology platforms – it is rather about the appropriate human interface with data technology. Let’s stop speculating that technology platforms will save the enterprise from all its data problems. I have seen the most advanced technology platforms that exist today, and one thing I know is that these platforms serve no purpose if we don’t have trained data professionals who bring three basic things: business/domain knowledge, analytical experience, and the ability to embrace new data technology.

LET’S START WITH DATA DELUGE (WHICH, BTW, IS NOT THE PROBLEM):

We all know there is data everywhere. In the past couple of years, the world has generated more data than in all of prior civilization put together. Whether it is content posted on the web and social media, data transmitted from sensors in cars, appliances, buildings and airplanes, or streams to your mobile, television or computer, we are surrounded and overwhelmed by data. Advancements in technology are the main driver of this data deluge, and similar advancements in collection and storage have made it economical for organizations to build the infrastructure to store and manage large sets of data. But the real problem is deriving value out of this data and making it useful. This is where most of the stagnation is today: according to International Data Corporation (IDC), only one percent of the digital data generated is currently being analyzed.

THE DATA REVOLUTION IS ABOUT INSIGHTS:

Everyone agrees there is a big data revolution happening, but it is not about the volume and scale of data being generated. The revolution is about the ability to actually do something with that data. What used to take millions of dollars – first building the infrastructure, then hiring really smart and expensive individuals to analyze the data – can now be done for thousands. It all comes down to using the right set of new-age technologies and implementing the right set of rules (read: algorithms) to deliver answers that weren’t possible earlier. This is where new-age data computation and analysis shines. We have come a long way in leveraging machine learning, graph analysis, predictive modeling algorithms and other techniques to uncover patterns and correlations that may not be readily apparent, but may turn out to be highly beneficial for business decision making.

There have been vast improvements in how and what type of datasets can be linked together to capture insights that aren’t possible with singular datasets. An example that everyone understands is how Amazon links together shopping and purchase history of customers to make product recommendations. Along with linking of datasets, improvements in visualization tools have made it much easier for humans to analyze data and see patterns. These technologies are now making inroads into all types of disparate use cases to solve complex problems ranging from pharmaceutical drug discovery to providing terrorism alerts.
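
As a rough illustration of that linking idea (a toy sketch, not Amazon’s actual method), even simple co-occurrence counting over purchase histories yields “bought together” recommendations:

    from collections import Counter
    from itertools import combinations

    baskets = [
        {"camera", "sd_card", "tripod"},
        {"camera", "sd_card"},
        {"tripod", "lens"},
        {"camera", "lens"},
    ]

    # Count how often each pair of items is purchased together
    co_occurs = Counter()
    for basket in baskets:
        for a, b in combinations(sorted(basket), 2):
            co_occurs[(a, b)] += 1

    def recommend(item, k=2):
        # Items most frequently bought together with `item`
        scores = Counter()
        for (a, b), n in co_occurs.items():
            if a == item:
                scores[b] += n
            elif b == item:
                scores[a] += n
        return [i for i, _ in scores.most_common(k)]

    print(recommend("camera"))  # -> ['sd_card', 'tripod']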

BUT, HERE’S THE PROBLEM:

Insights can only be delivered by data scientists, and there is a huge shortage of people who are comfortable handling large amounts of data. Data collection is easy and cheap, and the general approach is to collect everything and worry later about relevancy and finding patterns. This can be a mistake, especially with large datasets, because the numerous possible correlations increase the number of false positives that can surface. No matter how sophisticated technologies get, we need more data scientists outside of academia working full-time on solving real-world problems.

Machines cannot replace human beings when it comes to asking the right questions, deciding what to analyze, choosing what data to use, identifying patterns and interpreting results for the business. Machines are good at fast computation and analysis, but we need data scientists to build hypotheses, design tests, and use the data to confirm them. Traditional data scientists are not the whole solution, though. There are many generalists in the data science field who claim that if you throw data at them, they can deliver insights. But here’s the reality: someone who doesn’t have knowledge of your business can have only limited (if any) impact. In addition, data scientists need to make sure decision makers are not presented with too much data, because it quickly becomes useless. This is where technology and analytical experience come in handy – techniques that help aggregate, organize, filter and cluster data are extremely important for reducing datasets to digestible chunks.

IN CONCLUSION:

Company executives need to understand that human touch plays a fundamental role in the big data journey. Insights delivered by technology without proper human interface can put their business at risk, alienate customers, and damage their brand. Given the current advancements, it comes down to putting the right technologies to use and getting the right people (who know your business) in the room to derive value out of the ‘Big Data’. Is that an easy thing to do? What has been your experience?

You can also contact me on twitter @jasmeetio or on my personal website – Jasmeet Sawhney.

Source: Big Data’s Big Deal is HUMAN TOUCH, not Technology

Sep 20, 18: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

Image: Statistics (Source)

[ LOCAL EVENTS & SESSIONS]

More WEB events? Click Here

[ AnalyticsWeek BYTES]

>> Untangling Big Data for Digital Marketing by analyticsweekpick

>> Talent Analytics: Old Wine In New Bottles? by analyticsweekpick

>> Future of Public Sector and Jobs in #BigData World #FutureOfData #Podcast by v1shal

Wanna write? Click Here

[ NEWS BYTES]

>> Artificial intelligence: how much is hype and how much is reality? – Networks Asia Under Artificial Intelligence

>> New Intel chip flaws: ‘Foreshadow’ could wreak havoc on the cloud – The Mercury News Under Cloud

>> Is Your Healthcare Organization Prepared to Withstand a Data Security Breach? – Security Intelligence (blog) Under Data Security

More NEWS? Click Here

[ FEATURED COURSE]

Artificial Intelligence


This course includes interactive demonstrations which are intended to stimulate interest and to help students gain intuition about how artificial intelligence methods work under a variety of circumstances…. more

[ FEATURED READ]

Python for Data Analysis: Data Wrangling with Pandas, NumPy, and IPython


Python for Data Analysis is concerned with the nuts and bolts of manipulating, processing, cleaning, and crunching data in Python. It is also a practical, modern introduction to scientific computing in Python, tailored f… more

[ TIPS & TRICKS OF THE WEEK]

Keeping Biases Checked during the last mile of decision making
Today a data-driven leader, a data scientist or a data-driven expert is constantly put to the test, helping his team solve problems using his skills and expertise. Believe it or not, a part of that decision tree is derived from intuition, which adds a bias to our judgement and taints the suggestions. Most skilled professionals understand and handle these biases well, but in a few cases we give in to tiny traps and can find ourselves caught in biases that impair the judgement. So, it is important that we keep the intuition bias in check when working on a data problem.

[ DATA SCIENCE Q&A]

Q: Explain what a false positive and a false negative are. Why is it important to distinguish them from each other? Provide examples of when false positives are more important than false negatives, when false negatives are more important than false positives, and when the two types of errors are equally important
A: * False positive
Improperly reporting the presence of a condition when it is not present in reality. Example: an HIV-positive test result when the patient is actually HIV negative.

* False negative
Improperly reporting the absence of a condition when it is in fact present. Example: not detecting a disease when the patient actually has it.

When false positives are more important than false negatives:
– In a non-contagious disease, where treatment delay doesn’t have any long-term consequences but the treatment itself is grueling
– HIV test: psychological impact

When false negatives are more important than false positives:
– If early treatment is important for good outcomes
– In quality control: a defective item slips through the cracks!
– Software testing: a test designed to catch a virus has failed
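
A small sketch (assuming scikit-learn) makes the trade-off tangible: sweeping a classifier’s decision threshold converts false positives into false negatives and vice versa, so the “right” threshold depends on which error matters more.

    from sklearn.metrics import confusion_matrix

    y_true = [0, 0, 0, 0, 1, 1, 1, 1]                      # 1 = condition present
    scores = [0.1, 0.4, 0.35, 0.55, 0.45, 0.65, 0.8, 0.9]  # model scores

    for threshold in (0.3, 0.5, 0.7):
        y_pred = [int(s >= threshold) for s in scores]
        tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
        # Raising the threshold trades false positives for false negatives
        print(f"threshold={threshold}: false positives={fp}, false negatives={fn}")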

Source

[ VIDEO OF THE WEEK]

Understanding #BigData #BigOpportunity in Big HR by @MarcRind #FutureOfData #Podcast

Subscribe to YouTube

[ QUOTE OF THE WEEK]

It is a capital mistake to theorize before one has data. Insensibly, one begins to twist the facts to suit theories, instead of theories to suit facts. – Arthur Conan Doyle

[ PODCAST OF THE WEEK]

Solving #FutureOfWork with #Detonate mindset (by @steven_goldbach & @geofftuff) #JobsOfFuture #Podcast

Subscribe 

iTunes  GooglePlay

[ FACT OF THE WEEK]

By 2020, our accumulated digital universe of data will grow from 4.4 zettabytes today to around 44 zettabytes, or 44 trillion gigabytes.

Sourced from: Analytics.CLUB #WEB Newsletter

Entrepreneur America’s Wish-List For The President


The presidential campaign and election have ended with cheers and happiness for the Obama administration and its supporters. Now comes a huge task, as claimed by both candidates: the effort to re-build America. We all know why America is called a land of opportunity and has a distinct edge that separates it from the rest of the world. It is the entrepreneurial culture that gives everyone hope and the opportunity to see one’s dream come true. But with the current economic turmoil, political instability and lack of focus, America is losing its luster as the entrepreneurial powerhouse of the world. All of us entrepreneurial immigrants have heard one or more stories about the brain drain happening in America; several of my close entrepreneurial friends have moved back to their native lands to pursue their dreams. American startups account for the majority of new jobs in the country, and yet the political system is gradually ignoring its role, making America lose its edge as the world’s entrepreneurial leader. Many countries are exploiting this situation and providing great havens for entrepreneurs. So, this is a plea from the entrepreneurs of America to the president to help bring America back onto the map as the world’s top place to build a business and create innovation. Focusing on the following five things would go a long way toward assuring American entrepreneurs a successful run.

Immigration Reform:

We are all aware of America’s growing immigration problem and how it is hurting the country’s standing in the world. The overwhelming share of Hispanic voters who went for Obama sent a clear message to both parties: immigration reform should be a primary item on any agenda to win an election. For the math nerds: Hispanics made up 10 percent of the total vote and gave Obama almost three votes for every one earned by Romney. Obama may even have won a majority among Florida’s Cuban voters, who were once a Republican mainstay. For aspiring entrepreneurs stuck in the immigration debacle, it is a truly agonizing battle. Consider the crème-de-la-crème of immigrants, who graduate from top universities with top degrees and still have to wait 5 years before getting permanent residency. What does this forced 5-year wait do? It demotivates, all but kills the entrepreneurial spirit within, and turns an inventor into a worker. Imagine having to wait 5 years while sticking with a sponsor company for your H1B. Even if you have the entrepreneurial bug, you have no choice but to join a corporate job with a company willing to sponsor your H1B, work your way through that company until you get your green card, and only then call it quits and get cranking on your entrepreneurial adventure. Is anything wrong with this logic? No sane and capable mind can afford to lose 5 prime years of life to a political loophole. This is the sole reason many entrepreneurs leave the country for more favorable avenues to pursue their dreams. America is also facing a huge shortage in the talent pool that could prepare it for tomorrow. Immigration reform would open pathways for this much-awaited talent to enter and disrupt the startup world in the US of A. So, Mr. Obama, as much as we immigrant entrepreneurs are angry and annoyed by the delay in immigration reform, we hope it will make its way through in this term, helping America reclaim its crown when it comes to entrepreneurship.

Capital Access:

Want to invest in the future? What better way than investing in American entrepreneurs and helping them solve the problems of tomorrow. Why pay banks billions of dollars in bailouts that will not solve the economic crisis? Using that money to crank the innovation engine would go a long way, producing more business opportunities, more businesses and therefore more employment. Remember, more than 80% of the workforce is employed by small businesses, so it is important to invest in small businesses and startups. President Obama should use his 2nd term to create more opportunities for American small businesses. We saw some much-needed love from his administration during the 1st term, when the U.S. Small Business Administration was allowed a bigger wallet for the loan size on its largest lending program. More federal programs should be introduced to help entrepreneurs in a similar manner, providing funds, grants and contracts for commercializing products and taking their businesses to the next level. Other measures awaiting the light of day are the proposal to reduce taxes on startup expenses, the extension of a policy that allows businesses to deduct the cost of equipment and software, and a $1 billion expansion of the government’s small business investment program. Some tax breaks would also go a long way toward helping startups with operational expenses. So, capital access for startups is another big agenda item seeking the President’s attention.

Flexible and friendly regulations:

Regulations are another key area biting hard and making it almost impossible for many businesses to run smoothly. The Atlantic and Yahoo! Finance jointly published an in-depth article, “What Kills Small Business? Let’s Ask Them.” The article states that “69 percent of small business owners and managers say that complicated government regulations are ‘major impediments’ to the creation of new jobs.” Entrepreneurs, startups and small businesses always need flexible and friendly regulations. The last 4 years were consumed by partisan deadlock, delaying every effort to ease the laws that help small businesses. Luckily, President Obama is a strong believer in investing in small businesses. His 1st term did show some signs of progress, as the Jumpstart Our Business Startups (JOBS) Act was approved last April. This term should be used to promote and execute the JOBS Act, which helps small businesses gain access to financing with investor protection. The JOBS Act also raises the curtain on “crowdfunding”, which is going to be another huge channel for getting liquidity to small businesses, critical to startup success. The law also creates an IPO on-ramp that gives young companies temporary relief from certain SEC regulations, making it easier and more feasible to go public. So, these 4 years should be focused on making sure those programs see the light of day and leave businesses with fewer regulations in the way of growth. Similar acts should be rolled out as soon as possible, so that small businesses and startups can benefit from them sooner and grow rapidly without worrying about regulations getting in the way.

Boosting and fixing exports and trade:

This presidential election had us all listening to how unfairly China plays with its trade policy, and how big corporations take advantage of tax loopholes and seek refuge offshore to claim tax exemptions and report less revenue, costing America billions in taxes while giving big companies and foreign corporations an unfair advantage. To give small businesses a fair chance to survive, it is important for the Obama administration to fix those trade loopholes and give everyone a fair shot at winning the customer. So, as promised, the Obama administration should crack down on the laws that give undue advantage to a few. Looking ahead, per Obama campaign spokesman Adam Fetcher, the president “will make sure that no foreign company has an unfair advantage over American companies by offering more generous financing through the Export-Import Bank.” The Export-Import Bank supported a record $6 billion in financing and insurance for small businesses last year, and more than 85 percent of the bank’s transactions were for small businesses. In 2010, President Obama also set a goal to double exports over five years and created the National Export Initiative to promote U.S. goods, remove trade barriers, expand access to credit, and promote strong growth worldwide. This initiative will also help local small businesses get international exposure and an opportunity to expand. So, fixes and boosts to export and trade law will help entrepreneurs gain a competitive edge and an opportunity to expand into foreign markets. The president is also due to negotiate a new, expanded Trans-Pacific Partnership, a trade pact with nations including Australia, Singapore and Mexico, among others. Fixing import-export issues will go a long way toward establishing fair competition and hence healthy business.

Energize Startup/Small businesses initiative:

The Obama administration should help in every possible way to promote startups: provide as much opportunity, assistance, funding and work as possible. During the Obama administration’s 1st term, $300 billion in federal prime contracts were awarded to small businesses (since 2009), and one-third of all federal contracting dollars in the Recovery Act went to small businesses. President Obama has further emphasized his interest in an infrastructure bill that could boost construction and engineering jobs across the country. The president plans “to create an economy built to last that relies on small-business growth, investments in entrepreneurship, advanced manufacturing and increased exporting,” Fetcher says. This is certainly a fine strategy, as is the Kauffman Foundation’s work on the “Startup Act”. The Startup Act proposes softer immigration laws for entrepreneurs, improved channels for early-stage financing, fixing the broken patent system and clearing its backlog, and removing barriers to the formation and growth of businesses through automatic ten-year sunsets for all major rules. Startup America is another great initiative, started to help build a community of collaborative learning within startups. More such efforts are needed from the administration to assist startups, and many private companies have jumped into the pool after seeing the administration do it. Mr. President, please build more such programs that help entrepreneurs connect, collaborate and huddle their way to success.

So, notably, these are the 5 primary things needed to unleash entrepreneurs’ ability to build the world’s most disruptive startups and promote job growth. Mr. President, this is a humble plea from an immigrant entrepreneur who is trying to make a difference. With your help, we could all flourish, taking America with us on a path of prosperity and success. By the way, congratulations, Mr. President; you have been a good president, and hopefully this term will establish you as the President who put America on the Innovation 2.0 map and saved the world.

Originally Posted at: Entrepreneur America’s Wish-List For The President by v1shal

5 tips to becoming a big data superhero


Who’s the most powerful superhero?

Rishi Sikka, MD has a favorite and it’s one most people have probably never even heard of: Waverider.

Sikka, senior vice president of clinical transformation at Advocate Healthcare, considers Waverider the most powerful superhero because he can surf the time stream and predict the future.

Leading up to his presentation here at the Healthcare IT News Big Data and Healthcare Analytics Forum in New York, Sikka looked up the word “hero” and found that it has existed for millennia — it was even used prior to tongues we can trace — and the root concept is “to protect.”

Based on that definition from the Latin, and with a focus on population health management in mind, Sikka shared a fistful of tips about becoming a big data superhero.

1. Your power is looking to the future, but your strength lies in the past and present. Healthcare professionals and organizations must assemble the data necessary to understand their current state of being, including knowing as much as possible about patients.

2. Pick your battles wisely. “All the great superheroes know when it’s time to move on,” Sikka said, pointing to the need for risk stratification and strategic resource allocation, which is “where big data and population health intersect.”

3. Your enemy has a name – and it’s regression to the mean. “I know it’s not very sexy,” Sikka said of his description of that enemy. He recommended that healthcare organizations consider the impactability of what they are doing, that is, focus on where they can have the biggest impact. “I hope impactability will become a buzzword in the next year or two.”

4. Your superhero name is not … Cassandra. “It’s a lovely name,” Sikka explained, “just don’t pick it as a superhero name.” Why not? In Greek mythology, Cassandra, daughter of Apollo and a mortal mother, could predict the future. That was the blessing. The curse: Nobody believed her. “We don’t want population health to be an academic exercise.”

5. Don’t forget your mission. Every superhero is out fighting the bad guys, saving humanity, but sometimes even they can forget why they’re on this earth. “When we talk about population health we talk a lot about cost. We talk about bending the cost curve,” he added, “but I don’t know a single person on the front lines of care who gets jazzed up to bend the cost curve. The work revolves fundamentally around health.” Sikka suggested that healthcare professionals work to steer the dialogue back to clinical outcomes and wellness.

Sikka wound back around to the root of the word hero: “Our goal with respect to analytics, big data, population health,” he said, “is to protect, aid, support, those who give and receive care.”

To read the original article on Healthcare IT News, click here.

Originally Posted at: 5 tips to becoming a big data superhero

Sep 13, 18: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

Image: Data security (Source)

[ LOCAL EVENTS & SESSIONS]

More WEB events? Click Here

[ AnalyticsWeek BYTES]

>> How to Use MLflow, TensorFlow, and Keras with PyCharm by analyticsweek

>> The Upper Echelons of Cognitive Computing: Deriving Business Value from Speech Recognition by jelaniharper

>> 20 Best Practices for Customer Feedback Programs: Business Process Integration by bobehayes

Wanna write? Click Here

[ NEWS BYTES]

>> Senior Analyst, Marketing Analytics – Built In Chicago Under Marketing Analytics

>> Global Financial Analytics Market 2017-2026 By Raw Materials, Manufacturing Expenses And Process Analysis – DailyHover Under Financial Analytics

>> 5 Tactics That Separate Analytics Leaders From Followers – Forbes Under Analytics

More NEWS? Click Here

[ FEATURED COURSE]

CS109 Data Science


Learning from data in order to gain useful predictions and insights. This course introduces methods for five key facets of an investigation: data wrangling, cleaning, and sampling to get a suitable data set; data managem… more

[ FEATURED READ]

Superintelligence: Paths, Dangers, Strategies


The human brain has some capabilities that the brains of other animals lack. It is to these distinctive capabilities that our species owes its dominant position. Other animals have stronger muscles or sharper claws, but … more

[ TIPS & TRICKS OF THE WEEK]

Winter is coming, warm your Analytics Club
Yes and yes! As we head into winter, what better time to talk about our increasing dependence on data analytics to help with our decision making. Data- and analytics-driven decision making is rapidly sneaking its way into our core corporate DNA, yet we are not churning out practice grounds to test those models fast enough. Such snug-looking models have hidden nails that could induce uncharted pain if they go unchecked. This is the right time to start thinking about putting an Analytics Club [Data Analytics CoE] in your workplace to lab out the best practices and provide a test environment for those models.

[ DATA SCIENCE Q&A]

Q: Explain selection bias (with regard to a dataset, not variable selection). Why is it important? How can data management procedures such as missing data handling make it worse?
A: * Selection of individuals, groups or data for analysis in such a way that proper randomization is not achieved
Types:
– Sampling bias: systematic error due to a non-random sample of a population, causing some members to be less likely to be included than others
– Time interval: a trial may be terminated early at an extreme value (for ethical reasons), but the extreme value is likely to be reached by the variable with the largest variance, even if all the variables have similar means
– Data: “cherry picking”, when specific subsets of the data are chosen to support a conclusion (e.g. citing examples of plane crashes as evidence that airline flight is unsafe, while ignoring the far more common examples of flights that complete safely)
– Studies: performing experiments and reporting only the most favorable results
– Can lead to inaccurate or even erroneous conclusions
– Statistical methods can generally not overcome it

Why can missing data handling make it worse?
– Example: individuals who know or suspect that they are HIV positive are less likely to participate in HIV surveys
– Missing data handling will increase this effect, as imputation is based on the mostly HIV-negative respondents
– Prevalence estimates will be inaccurate
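
A small simulation (a sketch using only Python’s standard library) shows the effect: with a 10% true prevalence, positives who respond less often drag the survey estimate down, and mean imputation for non-responders simply reproduces the biased estimate instead of recovering the truth.

    import random

    random.seed(0)
    population = [1 if random.random() < 0.10 else 0 for _ in range(100_000)]

    # Selection bias: positives respond at 30%, negatives at 80%
    responses = [s for s in population if random.random() < (0.3 if s else 0.8)]
    biased = sum(responses) / len(responses)
    print(f"true prevalence: 0.10, biased estimate: {biased:.3f}")  # around 0.04

    # Mean imputation for the non-responders just bakes the bias in
    n_missing = len(population) - len(responses)
    imputed = (sum(responses) + biased * n_missing) / len(population)
    print(f"after mean imputation: {imputed:.3f}")  # still around 0.04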

Source

[ VIDEO OF THE WEEK]

#GlobalBusiness at the speed of The #BigAnalytics

Subscribe to YouTube

[ QUOTE OF THE WEEK]

I keep saying that the sexy job in the next 10 years will be statisticians. And I’m not kidding. – Hal Varian

[ PODCAST OF THE WEEK]

#BigData @AnalyticsWeek #FutureOfData #Podcast with Nathaniel Lin (@analytics123), @NFPA

Subscribe 

iTunes  GooglePlay

[ FACT OF THE WEEK]

More than 200bn HD movies – which would take a person 47m years to watch.

Sourced from: Analytics.CLUB #WEB Newsletter

Big universe, big data, astronomical opportunity

Image: Open cluster Messier 39 in the constellation Cygnus, 30 Oct 2010. © Alan Dyer/Stocktrek Images/Corbis

Astronomical data is and has always been “big data”. Once that was only true metaphorically; now it is true in all senses. We acquire it far more rapidly than the rate at which we can process, analyse and exploit it. This means we are creating a vast global repository that may already hold answers to some of the fundamental questions of the Universe we are seeking.

Does this mean we should cancel our upcoming missions and telescopes – after all, why continue to order food when the table is replete? Of course not. What it means is that, while we continue our inevitable yet budget-limited advancement into the future, we must also simultaneously do justice to the data we have already acquired.

In a small way, we are already doing this. Consider citizen science, where public participation in the analysis of archived data increases the possibility of real scientific discovery. It’s a natural evolution, giving those with spare time on their hands the chance to advance scientific knowledge.

However, soon this will not be sufficient. What we need is a new breed of professional astronomy data-miners eager to get their hands dirty with “old” data, with the capacity to exploit more readily the results and findings.

Thus far, human ingenuity and current technology have ensured that data storage capabilities have kept pace with the massive output of the electronic stargazers. The real struggle is now figuring out how to search and synthesize that output.

The greatest challenges for tackling large astronomical data sets are:

  • Visualisation of astronomical datasets
  • Creation and utilisation of efficient algorithms for processing large datasets
  • The efficient development of, and interaction with, large databases
  • The use of “machine learning” methodologies

The challenges unique to astronomical data are borne out of the characteristics of big data – the three Vs: volume (the amount of data), variety (the complexity of the data and the sources it is gathered from) and velocity (the rate of data and information flow). It is a problem that is getting worse.

In 2004, the data I used for my Masters had been acquired in the mid-1990s by the United Kingdom Infra-Red Telescope (UKIRT) in Hawaii. In total it amounted to a few tens of gigabytes.

Moving onward just a matter of months to my PhD, I was studying data taken from one of the most successful ground-based surveys in the history of astronomy, the Sloan Digital Sky Survey (SDSS). The volume of data I had to cope with was orders of magnitude greater.

SDSS entered routine operations in 2000. At the time of Data Release 12 (DR12) in July 2014 the total volume of that release was 116TB. Even this pales next to the Large Synoptic Survey Telescope (LSST). Planned to enter operation in 2022, it is aiming to gather 30TB a night.

To make progress with this massive data set, astronomy must embrace a new era of data-mining techniques and technologies. These include the application of artificial intelligence, machine learning, statistics, and database systems, to extract information from a data set and transform it into an understandable structure for further use.

Now while many scientists find themselves focused on solving these issues, let’s just pull back a moment and ask the tough questions. For what purpose are we gathering all this new data? What value do we gain from just collecting it? For that matter, have we learned all that we can from the data that we have?

It seems that the original science of data, astronomy, has a lot to learn from the new kid on the block, data science. Think about it. What if, as we strive to acquire and process more photons from the farther reaches of the universe, from ever more exotic sources and with ever more complex instrumentation, the answers are already sitting on a dusty server here on Earth – if we would only pick up that dataset and look at it … possibly for the first time.

Dr Maya Dillon is the community manager for Pivigo. The company supports analytical PhDs making the transition into the world of Data Science and also runs S2DS: Europe’s largest data science boot-camp.

To read the original article on The Guardian, click here.

Source: Big universe, big data, astronomical opportunity by analyticsweekpick

Sep 06, 18: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

Image: Big Data knows everything (Source)

[ LOCAL EVENTS & SESSIONS]

More WEB events? Click Here

[ AnalyticsWeek BYTES]

>> Google Cloud security updates for SEO before 2018 GDPR to change business data interactions! by thomassujain

>> Jul 27, 17: #AnalyticsClub #Newsletter (Events, Tips, News & more..) by admin

>> Seven ways predictive analytics can improve healthcare by analyticsweekpick

Wanna write? Click Here

[ NEWS BYTES]

>> UI data science degree sees rise in enrollment as chance of employment soars – Daily Illini Under Data Science

>> Know What will be the Future Scenario of Hadoop-as-a-Service Market – Truthful Observer Under Hadoop

>> IoT Time Podcast S.3 Ep.24 Smart City of San Antonio – IoT Evolution World (blog) Under IOT

More NEWS? Click Here

[ FEATURED COURSE]

Data Mining


Data that has relevance for managerial decisions is accumulating at an incredible rate due to a host of technological advances. Electronic data capture has become inexpensive and ubiquitous as a by-product of innovations… more

[ FEATURED READ]

Thinking, Fast and Slow


Drawing on decades of research in psychology that resulted in a Nobel Prize in Economic Sciences, Daniel Kahneman takes readers on an exploration of what influences thought example by example, sometimes with unlikely wor… more

[ TIPS & TRICKS OF THE WEEK]

Keeping Biases Checked during the last mile of decision making
Today a data-driven leader, a data scientist or a data-driven expert is constantly put to the test, helping his team solve problems using his skills and expertise. Believe it or not, a part of that decision tree is derived from intuition, which adds a bias to our judgement and taints the suggestions. Most skilled professionals understand and handle these biases well, but in a few cases we give in to tiny traps and can find ourselves caught in biases that impair the judgement. So, it is important that we keep the intuition bias in check when working on a data problem.

[ DATA SCIENCE Q&A]

Q: When would you use random forests vs SVM, and why?
A: * In the case of a multi-class classification problem: SVM will require a one-against-all method (memory intensive)
* If one needs to know the variable importance (random forests can compute it directly)
* If one needs a model fast (SVM takes long to tune: you need to choose the appropriate kernel and its parameters, for instance sigma and epsilon)
* In a semi-supervised learning context (random forest with a dissimilarity measure): SVM can work only in a supervised learning mode
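
A short sketch (assuming scikit-learn) of the trade-off: the random forest handles the multi-class problem without extra machinery and exposes variable importance, while the SVM’s kernel and parameters (C, gamma) need tuning:

    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)  # a 3-class problem

    rf = RandomForestClassifier(n_estimators=100, random_state=0)
    svm = SVC(kernel="rbf", C=1.0, gamma="scale")  # kernel/C/gamma need tuning

    print("RF  accuracy:", cross_val_score(rf, X, y, cv=5).mean())
    print("SVM accuracy:", cross_val_score(svm, X, y, cv=5).mean())

    rf.fit(X, y)
    print("variable importance:", rf.feature_importances_)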

Source

[ VIDEO OF THE WEEK]

@AnalyticsWeek: Big Data Health Informatics for the 21st Century: Gil Alterovitz

Subscribe to YouTube

[ QUOTE OF THE WEEK]

Processed data is information. Processed information is knowledge. Processed knowledge is wisdom. – Ankala V. Subbarao

[ PODCAST OF THE WEEK]

@BrianHaugli @The_Hanover on Building a #Leadership #Security #Mindset #FutureOfData #Podcast

Subscribe 

iTunes  GooglePlay

[ FACT OF THE WEEK]

571 new websites are created every minute of the day.

Sourced from: Analytics.CLUB #WEB Newsletter

2017 Trends in the Internet of Things

The Internet of Things is lurching forward into the coming year like never before. Its growth is manifesting rapidly, even exponentially, with a broadening array of use cases and applications influencing verticals well removed from its conventional patronage in the industrial internet.

With advances throughout the public and private sectors, its sway is extending beyond retail and supply chain management to encompass delivery route optimization, financial services, healthcare and the marked expansion of the telecommunication industry in the form of connected cities and connected cars.

A host of technological approaches, some novel, some refined, will emerge in the coming year to facilitate the analytics and security functionality necessary to solidify the IoT’s impact across the data sphere, with a unique blend of big data, cloud, cognitive computing and processing advancements for customized applications of this expression of IT.

The result will be a personalization of business opportunities and consumer services veering ever closer to lay users.

Speed of Thought Analytics
The interminable generation and streaming of sensor data foundational to the IoT warrants heightened analytic productivity, facilitated in a variety of ways. Surmounting the typical schema constraints germane to the relational world can involve semantic technologies with naturally evolving models to accommodate time-sensitive data. Other techniques involve file formats capable of deriving schema on the fly. “Self-describing formats is the umbrella,” MapR Senior Vice President of Data and Applications Jack Norris reflected. “There are different types of files that kind of fall into that, such as JSON and Avro.” Still other approaches involve Graphics Processing Units (GPUs), which have emerged as a preferable alternative to conventional Central Processing Units (CPUs), enabling what Kinetica VP of Global Solution Engineering Eric Mizell referred to as answering questions at “the speed of thought” – in which organizations are not limited by schema and indexing designs in the number, speed, and type of questions their analytics can answer in real time.
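
A minimal sketch of schema-on-read with a self-describing format (plain JSON here; Avro embeds its schema similarly): each record carries its own field names, so new fields can appear without a declared table schema.

    import json

    records = [
        '{"sensor": "t-101", "temp_c": 21.5}',
        '{"sensor": "t-102", "temp_c": 19.8, "humidity": 0.41}',  # a new field appears
    ]

    for line in records:
        rec = json.loads(line)
        print(sorted(rec.keys()))  # schema discovered per record, on read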

According to Mizell, GPUs are “purpose-built for repetitive tasks at parallel with thousands of cores for aggregation, mathematics, and those type of things”, whereas CPUs are better for discrete, sequential operations. Analytics platforms leveraging GPUs – particularly those designed for the IoT – are not bound by schema and rigid indexing, allowing for multiple questions at a rate commensurate with the speed at which data is generated, especially when augmented with visualization mechanisms illustrating fluctuating data states. “You can ask questions of the data without having to have predetermined questions and find answers in human-readable time,” Mizell explained. “We’re getting tremendous response from customers able to load hundreds of millions and billions of rows [and] include them in interactive time. It transforms what business can do.” These capabilities are integral to the expansion of the IoT in the telecommunications industry, as “connected cities and connected cars are huge with a lot of the telcos,” according to Mizell.
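
To make the GPU/CPU contrast concrete, here is a minimal sketch (assuming the CuPy library and a CUDA-capable GPU): the same aggregation runs on a few CPU cores via NumPy and on thousands of GPU cores via CuPy.

    import numpy as np
    import cupy as cp  # assumes a CUDA-capable GPU is available

    cpu_data = np.random.random(50_000_000)
    gpu_data = cp.asarray(cpu_data)  # copy the array into GPU memory

    print("CPU sum:", cpu_data.sum())         # runs on the CPU
    print("GPU sum:", float(gpu_data.sum()))  # data-parallel across GPU cores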

Machine Interaction
The best means of deriving value from the IoT actually transcends analytics and necessitates creating action between connected machines. Effecting such action in real-time will increasingly come to rely on the various forms of artificial intelligence pervading throughout modern enterprises, which is most readily accessible with options for machine learning and deep learning. Furthermore, Forbes contends AI is steadily moving to the cloud, which is instrumental in making these capabilities available to all organizations—not just those armed with a slew of data scientists. Regarding the options for libraries of deep learning and machine learning algorithms available to organizations today, Mizell remarked, “We’re exposing those libraries for consumers to use on analytics and streaming. On the data streaming end we’ll be able to execute those libraries on demand to make decisions in real-time.” The most cogent use case for machine-to-machine interaction involving the IoT pertains to connected cars, autonomous vehicles, and some of the more cutting edge applications for race car drivers. These vehicles are able to account for the requisite action necessary in such time-sensitive applications by leveraging GPU-facilitated AI in real time. “For autonomous cars, the Tesla has a bank of GPUs in the trunk,” Mizell commented. “That’s how it’s able to read the road in real-time.”

Back from the Edge
Another substantial trend to impact the IoT in the coming year is the evolving nature of the cloud as it relates to remote streaming and sensor data applications. Cloud developments in previous years were focused on the need for edge computing. The coming year will likely see a greater emphasis on hybrid models combining the decentralized paradigm with the traditional centralized one. In circumstances in which organizations have real-time, remote data sources on a national scale, “You can’t respond to it fast enough if you’re piping it all the way down to your data center,” Mizell said. “You’ll have a mix of hybrid but the aggregation will come local. The rest will become global.” One of the best use cases for such hybrid cloud models for the IoT comes from the U.S. Postal Service, which Mizell mentioned is utilizing the IoT to track mail carriers, optimize their routes, and increase overall efficiency. This use case is similar to deployments in retail in which route optimization is ascertained for supply chain management and the procurement of resources. Still, the most prominent development affecting the IoT’s cloud developments could be that “all of the cloud vendors are now providing GPUs,” Mizell said. “That’s very new this year. You’ve got all the big three with a bank of GPUs at the ready.” This development is aligned with the broadening AI capabilities found in the cloud.

Software Defined Security
Implementing IoT action and analytics in a secure environment could very well represent the central issue in the viability of this technology for the enterprise. Numerous developments in security are taking place to reduce the number and magnitude of attacks on the IoT. One means of protecting both endpoint devices and the centralized networks upon which they are based is software defined networking, which is enjoying a resurgence of sorts due to IoT security concerns. The core of the software defined networking approach is the intelligent provisioning of resources on demand for the various concerns of a particular network. In some instances this capability includes dedicating resources for bandwidth and traffic management; in others it applies directly to security. In the latter instance the network can create routes for various devices – on the fly – to either connect or disconnect them from centralized frameworks according to security protocols. “Devices are popping up left and right,” Mizell acknowledged. “If it’s an unknown device, shut it down. Even if it has a username and a password, don’t give it access.” Some applications of the IoT certainly warrant such security measures, including financial industry forays into the realm of digital banking, in which mobile devices function as ATMs, allowing users to withdraw funds from their phones and have cash delivered to them. “That’s what they say is in the works,” Mizell observed.

Endpoint Security
Security challenges for the IoT are exacerbated by the involvement of endpoint devices, which typically have much less robust security than centralized frameworks do. Moreover, such devices can actually perpetuate attacks on the IoT and wreak havoc on centralized mainframes. Strengthening device security can now take the form of endpoint device registration and authorization. According to Mizell: “There’s a notion of device registration, whether it’s on the network or not. If [you] can bring your phone or whatever device to work, it detects the device by its signature, and then says it only has access to the internet. So you start locking devices into a certain channel.” Blockchain technologies can also prove influential in securing the IoT. These technologies have built-in encryption techniques that are useful for this purpose. Moreover, they utilize a decentralized framework in which the validity of an action or addendum to the blockchain (which could pertain to IoT devices in this case) is determined by reaching a consensus among those involved in it. This decentralized, consensus-based authorization could prove valuable for protecting the IoT from attacks.
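
As a hypothetical sketch of that registration policy (the names below are illustrative, not any specific vendor’s API): devices are looked up by signature, and anything unregistered is denied regardless of credentials.

    # Device signature -> the channel the device is locked into
    REGISTERED = {
        "a4:5e:60:d1:22:10": "internet_only",
        "b8:27:eb:01:02:03": "full_access",
    }

    def route_for(signature, has_credentials=False):
        # Credentials are deliberately ignored for unknown devices:
        # "Even if it has a username and a password, don't give it access."
        return REGISTERED.get(signature, "deny")

    print(route_for("a4:5e:60:d1:22:10"))                        # internet_only
    print(route_for("de:ad:be:ef:00:01", has_credentials=True))  # deny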

Democratization
As the use cases for the IoT become more and more surprising, it is perhaps reassuring to realize that the technologies enabling them are becoming more dependable. Accessing the cognitive computing capabilities to implement machine-based action and swift analytics via the cloud is within the grasp of most organizations. The plethora of security options can strengthen IoT networks, helping to justify their investments. Hybrid cloud models use the best of both worlds for instantaneous action as well as data aggregation. Thus, the advantages of the continuous connectivity and data generation of this technology are showing significant signs of democratization.

Source: 2017 Trends in the Internet of Things by jelaniharper