Jul 19, 18: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

Image: Complex data (Source)

[ LOCAL EVENTS & SESSIONS]

More WEB events? Click Here

[ NEWS BYTES]

>> Protecting data in a hybrid cloud environment | Network World – Network World Under Hybrid Cloud

>> Big Four Vs of Big Data – BW Businessworld Under Big Data

>> Sr. Marketing Analyst – United States – Built In Austin Under Marketing Analytics

More NEWS ? Click Here

[ FEATURED COURSE]

CS109 Data Science


Learning from data in order to gain useful predictions and insights. This course introduces methods for five key facets of an investigation: data wrangling, cleaning, and sampling to get a suitable data set; data managem… more

[ FEATURED READ]

On Intelligence


Jeff Hawkins, the man who created the PalmPilot, Treo smart phone, and other handheld devices, has reshaped our relationship to computers. Now he stands ready to revolutionize both neuroscience and computing in one strok… more

[ TIPS & TRICKS OF THE WEEK]

Winter is coming, warm your Analytics Club
As we head into winter, what better time to talk about our increasing dependence on data analytics in decision making? Data- and analytics-driven decision making is rapidly working its way into our core corporate DNA, yet we are not building practice grounds to test those models fast enough. Snug-looking models can hide flaws that cause unexpected pain if they go unchecked. This is the right time to start thinking about setting up an Analytics Club (a Data Analytics Center of Excellence) in your workplace to incubate best practices and provide a test environment for those models.

[ DATA SCIENCE Q&A]

Q: Explain Tufte’s concept of ‘chart junk’.
A: Chart junk refers to all visual elements in charts and graphs that are not necessary to comprehend the information represented, or that distract the viewer from that information (a short matplotlib sketch follows below).

Examples of unnecessary elements include:
– Unnecessary text
– Heavy or dark grid lines
– Ornamented chart axes
– Pictures
– Background
– Unnecessary dimensions
– Elements depicted out of scale to one another
– 3-D simulations in line or bar charts

Source
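As an illustration of removing chart junk in practice (this sketch is mine, not from the source above), a minimal matplotlib example that strips unnecessary decoration from a simple bar chart:

```python
import matplotlib.pyplot as plt

categories = ["A", "B", "C", "D"]
values = [23, 42, 17, 35]

fig, ax = plt.subplots()
ax.bar(categories, values, color="steelblue")

# Strip decoration that carries no information (chart junk).
ax.spines["top"].set_visible(False)      # no box around the plot
ax.spines["right"].set_visible(False)
ax.grid(False)                           # no heavy grid lines
ax.set_title("Items sold per category")  # keep only necessary text
ax.set_ylabel("Items sold")

plt.tight_layout()
plt.show()
```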

[ VIDEO OF THE WEEK]

Pascal Marmier (@pmarmier) @SwissRe discusses running data driven innovation catalyst


Subscribe to YouTube

[ QUOTE OF THE WEEK]

Without big data, you are blind and deaf and in the middle of a freeway. – Geoffrey Moore

[ PODCAST OF THE WEEK]

@TimothyChou on World of #IOT & Its #Future Part 1 #FutureOfData #Podcast


Subscribe 

iTunes | Google Play

[ FACT OF THE WEEK]

A quarter of decision-makers surveyed predict that data volumes in their companies will rise by more than 60 per cent by the end of 2014, with the average of all respondents anticipating a growth of no less than 42 per cent.

Sourced from: Analytics.CLUB #WEB Newsletter

A Timeline of Future Technologies 2019-2055

Our friends at Futurism used data from the National Academy of Sciences, SmartThings Future Living reports, Scientific American, the University of Bristol and several other sources to create this fascinating infographic.

A Timeline of Future Technologies 2019-2055

originally posted @ https://futurism.com/images/things-to-come-a-timeline-of-future-technology-infographic/

Source by v1shal

Jul 12, 18: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

Image: Statistically Significant (Source)

[ LOCAL EVENTS & SESSIONS]

More WEB events? Click Here

[ AnalyticsWeek BYTES]

>> 4 Ways Big Data Will Change Every Business by analyticsweekpick

>> 3 Emerging Big Data Careers in an IoT-Focused World by kmartin

>> Analyzing Big Data: A Customer-Centric Approach by bobehayes

Wanna write? Click Here

[ NEWS BYTES]

>> Global Data Analytics Outsourcing Market – By Key Players, Type, Application, Region, and Forecast 2018-2025 – City Councilor Under Financial Analytics

>> Briggs And Stratton Hiring For 65 Jobs In Wauwatosa – Patch.com Under Sales Analytics

>> Daunting challenges of data security and privacy & how AI comes to rescue – Analytics India Magazine Under Data Security

More NEWS ? Click Here

[ FEATURED COURSE]

Pattern Discovery in Data Mining


Learn the general concepts of data mining along with basic methodologies and applications. Then dive into one subfield in data mining: pattern discovery. Learn in-depth concepts, methods, and applications of pattern disc… more

[ FEATURED READ]

Hypothesis Testing: A Visual Introduction To Statistical Significance


Statistical significance is a way of determining whether an outcome occurred by random chance or whether something caused it to differ from the expected baseline. Statistical significance calculations find their … more
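As a concrete, hedged illustration of the idea (my own sketch, not an excerpt from the book), a two-sample t-test in Python with SciPy:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Simulated outcomes: a control group and a treatment group whose
# true mean is slightly higher than the baseline.
control = rng.normal(loc=100.0, scale=15.0, size=200)
treatment = rng.normal(loc=105.0, scale=15.0, size=200)

# Two-sample t-test: could a difference this large arise by chance?
t_stat, p_value = stats.ttest_ind(treatment, control)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# A p-value below a preset threshold (commonly 0.05) is conventionally
# treated as statistically significant.
if p_value < 0.05:
    print("Unlikely to be due to random chance alone.")
else:
    print("Consistent with random chance.")
```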

[ TIPS & TRICKS OF THE WEEK]

Data aids, not replaces, judgement
Data is a tool and a means to help build consensus and facilitate human decision-making, not replace it. Analysis converts data into information; information, seen in context, leads to insight; insight leads to decisions, which ultimately lead to outcomes that bring value. So data is just the start: context and intuition still play a role.

[ DATA SCIENCE Q&A]

Q: How do you take millions of users, each with hundreds of transactions across tens of thousands of products, and group them into meaningful segments?
A: 1. Some exploratory data analysis (get a first insight):

* Transactions by date
* Count of customers vs. number of items bought
* Total items vs. total basket per customer
* Total items vs. total basket per area

2. Create new features (per customer):

Counts:

* Total baskets (unique days)
* Total items
* Total spent
* Unique product id

Distributions:

* Items per basket
* Spent per basket
* Product id per basket
* Duration between visits
* Product preferences: proportion of items per product cat per basket

3. If there are too many features, apply dimensionality reduction (e.g., PCA); a sketch follows after this answer.

4. Clustering:

* e.g., k-means on the reduced (PCA) feature space

5. Interpreting the model fit
* View the clustering by principal-component axis pairs (PC1 vs. PC2, PC2 vs. PC3, etc.).
* Interpret each principal component in terms of the linear combination it is derived from; for example, PC1 = a “spend” axis (proportion of baskets containing expensive items, raw counts of items and visits).

Source
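A minimal end-to-end sketch of steps 3-5 in Python with scikit-learn, assuming a per-customer feature table has already been built (the column names below are illustrative, not from the answer above):

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Hypothetical per-customer feature table (step 2).
rng = np.random.default_rng(0)
features = pd.DataFrame({
    "total_baskets": rng.poisson(20, 1000),
    "total_items": rng.poisson(150, 1000),
    "total_spent": rng.gamma(2.0, 200.0, 1000),
    "unique_products": rng.poisson(40, 1000),
    "mean_items_per_basket": rng.normal(7, 2, 1000),
})

# Step 3: standardize, then reduce dimensionality with PCA.
X = StandardScaler().fit_transform(features)
pca = PCA(n_components=2)
X_pca = pca.fit_transform(X)

# Step 4: cluster in the reduced space.
kmeans = KMeans(n_clusters=5, n_init=10, random_state=0)
features["segment"] = kmeans.fit_predict(X_pca)

# Step 5: inspect component loadings and per-segment profiles.
loadings = pd.DataFrame(pca.components_,
                        columns=features.columns[:-1],
                        index=["PC1", "PC2"])
print(loadings.round(2))
print(features.groupby("segment").mean().round(1))
```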

[ VIDEO OF THE WEEK]

@JohnTLangton from @Wolters_Kluwer discussed his #AI Lead Startup Journey #FutureOfData #Podcast


Subscribe to YouTube

[ QUOTE OF THE WEEK]

We chose it because we deal with huge amounts of data. Besides, it sounds really cool. – Larry Page

[ PODCAST OF THE WEEK]

@CRGutowski from @GE_Digital on Using #Analytics to #Transform Sales #FutureOfData #Podcast


Subscribe 

iTunes | Google Play

[ FACT OF THE WEEK]

Big data is a top business priority and drives enormous opportunity for business improvement. Wikibon’s own study projects that big data will be a $50 billion business by 2017.

Sourced from: Analytics.CLUB #WEB Newsletter

AtScale opens Hadoop’s big-data vaults to nonexpert business users

When it comes to business intelligence, most enterprise users are intimately acquainted with tools such as Microsoft Excel. They tend to feel less comfortable with data-management technologies like Hadoop—despite the considerable insights such tools could offer.

Enter AtScale, a startup that on Tuesday emerged from stealth with a new offering designed to put those capabilities within closer reach. The AtScale Intelligence Platform enables interactive, multidimensional analyses on Hadoop from within standard BI tools such as Microsoft Excel, Tableau Software or QlikView, without the need for any data movement, custom drivers or a separate cluster.

“Today, millions of information workers could derive value from Hadoop, but their organizations have not been able to empower them to do so, either because their current toolset doesn’t work natively with Hadoop or because IT doesn’t have the tools to provision them with secure, self-service access,” said Dave Mariani, AtScale’s founder and CEO.

In essence, AtScale’s platform aims to give business users the ability to analyze in real time the entirety of their Hadoop data—tapping Hadoop SQL engines like Hive, Impala and Spark SQL—using the BI tools they are already familiar with. Its intent is similar in many ways to that of Oracle, which recently unveiled new big-data tools of its own for nonexperts.
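For context, this is roughly what querying a Hadoop SQL engine such as Hive looks like when done directly from Python with the PyHive library; AtScale’s pitch is that BI users get equivalent access from Excel or Tableau without writing code like this. The connection details and table below are placeholders, not part of AtScale’s product:

```python
from pyhive import hive  # pip install pyhive

# Placeholder connection settings; not AtScale-specific.
conn = hive.Connection(host="hadoop-gateway.example.com", port=10000,
                       username="analyst", database="sales")
cursor = conn.cursor()

# A typical aggregate query a BI tool might issue behind the scenes.
cursor.execute("""
    SELECT region, SUM(revenue) AS total_revenue
    FROM transactions
    GROUP BY region
    ORDER BY total_revenue DESC
""")
for region, total_revenue in cursor.fetchall():
    print(region, total_revenue)

cursor.close()
conn.close()
```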

AtScale’s software strives to make big-data analytics accessible in several ways. Its cube designer, for instance, turns Hadoop data into interactive OLAP cubes with full support for arrays, structs and non-scalars, so that complex data can be converted into measures and dimensions that anyone can understand and manage, the company says.

“We have a community of more than 110 million users and a massive amount of data about how people play our games,” said Craig Fryar, head of business intelligence at Wargaming, creator of online game World of Tanks. “Our cluster stores billions of events that we can now easily explore in just a few clicks.”

Originally posted via “AtScale opens Hadoop’s big-data vaults to nonexpert business users”

Source by analyticsweekpick

Jul 05, 18: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

Image: Tour of Accounting (Source)

[ LOCAL EVENTS & SESSIONS]

More WEB events? Click Here

[ NEWS BYTES]

>> NIH Data Science Plan Aims to Boost Data Analytics, Access – Health IT Analytics Under Health Analytics

>> Much More Than Toys – Why Organizations Need to Industrialize the Data Science Playground – Datamation Under Data Science

>> ‘Sexiest job’ ignites talent wars as demand for data geeks soars – Chicago Tribune Under Data Scientist

More NEWS ? Click Here

[ FEATURED COURSE]

Pattern Discovery in Data Mining


Learn the general concepts of data mining along with basic methodologies and applications. Then dive into one subfield in data mining: pattern discovery. Learn in-depth concepts, methods, and applications of pattern disc… more

[ FEATURED READ]

The Future of the Professions: How Technology Will Transform the Work of Human Experts


This book predicts the decline of today’s professions and describes the people and systems that will replace them. In an Internet society, according to Richard Susskind and Daniel Susskind, we will neither need nor want … more

[ TIPS & TRICKS OF THE WEEK]

Strong business case could save your project
As with anything in corporate culture, a project is often about the business, not the technology, and the same thinking applies to data analysis: it is not always about the technicalities but about the business implications. Data science project success criteria should therefore include project-management success criteria as well. This ensures smooth adoption, easier buy-in, room for wins and cooperative stakeholders. So a good data scientist should also possess some of the qualities of a good project manager.

[ DATA SCIENCE Q&A]

Q: Provide examples of machine-to-machine communication.
A: Telemedicine
– Heart patients wear specialized monitors that gather information about the state of the heart
– The collected data is sent to an implanted electronic device that delivers electric shocks to the patient to correct abnormal rhythms

Product restocking
– Vending machines can message the distributor whenever an item is running out of stock (a short sketch follows below)

Source
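A minimal sketch of the product-restocking example, assuming the vending machine reports low stock to a hypothetical distributor endpoint over HTTP (the URL and payload fields are illustrative, not from the answer above):

```python
import requests  # pip install requests

# Hypothetical distributor endpoint; not a real service.
RESTOCK_URL = "https://distributor.example.com/api/restock"
LOW_STOCK_THRESHOLD = 3

def report_stock(machine_id: str, item: str, quantity_left: int) -> None:
    """Send a machine-to-machine restock request when stock runs low."""
    if quantity_left >= LOW_STOCK_THRESHOLD:
        return  # enough stock, nothing to do
    payload = {
        "machine_id": machine_id,
        "item": item,
        "quantity_left": quantity_left,
    }
    response = requests.post(RESTOCK_URL, json=payload, timeout=5)
    response.raise_for_status()

# Example: machine VM-042 is almost out of sparkling water.
report_stock("VM-042", "sparkling-water", quantity_left=1)
```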

[ VIDEO OF THE WEEK]

@chrisbishop on futurist's lens on #JobsOfFuture #FutureofWork #JobsOfFuture #Podcast


Subscribe to YouTube

[ QUOTE OF THE WEEK]

Data beats emotions. – Sean Rad, founder of Ad.ly

[ PODCAST OF THE WEEK]

#BigData @AnalyticsWeek #FutureOfData #Podcast with  John Young, @Epsilonmktg


Subscribe 

iTunes | Google Play

[ FACT OF THE WEEK]

IDC estimates that by 2020, business transactions on the internet – business-to-business and business-to-consumer – will reach 450 billion per day.

Sourced from: Analytics.CLUB #WEB Newsletter

May 15, 2017 Health and Biotech analytics news roundup

NHS taps artificial intelligence to crack cancer detection: The UK’s health service, Intel, and the University of Warwick are collaborating to form a ‘digital repository’ of known tumor cells, which they hope to use to develop new algorithms for classification of those cells.

When Cancer Patients Should Ask For Genetic Sequencing: At Memorial Sloan Kettering Cancer Center, more than 10,000 tumor biopsies from patients with advanced cancer were sequenced. Many of the patients had mutations that could be addressed with current treatments.

Intermountain makes strides in precision medicine, advanced imaging: The Utah-based health system has added technology to perform specific and sensitive sequencing of solid tumor cells as well as imaging technology to monitor wounds.

How providers can use analytics to manage risk in value-based care: Two examples are presented: a Maryland hospital and a New York physician’s network.

Source: May 15, 2017 Health and Biotech analytics news roundup by pstein

Jun 28, 18: #AnalyticsClub #Newsletter (Events, Tips, News & more..)

[  COVER OF THE WEEK ]

Image: Accuracy check (Source)

[ LOCAL EVENTS & SESSIONS]

More WEB events? Click Here

[ AnalyticsWeek BYTES]

>> Map of US Hospitals and their Process of Care Metrics by bobehayes

>> February 27, 2017 Health and Biotech analytics news roundup by pstein

>> Big data analytics startup Sqrrl raises $7M by analyticsweekpick

Wanna write? Click Here

[ NEWS BYTES]

>> Lessons learnt from machine learning in fraud management – Enterprise Innovation Under Machine Learning

>> Northrop Grumman in $7.3M defense contract for ‘machine learning … – Newsday Under Machine Learning

>> The Vaping World: How Many Vapers Are There? – Vaping Daily (blog) Under Statistics

More NEWS ? Click Here

[ FEATURED COURSE]

Deep Learning Prerequisites: The Numpy Stack in Python


The Numpy, Scipy, Pandas, and Matplotlib stack: prep for deep learning, machine learning, and artificial intelligence… more

[ FEATURED READ]

Rise of the Robots: Technology and the Threat of a Jobless Future


What are the jobs of the future? How many will there be? And who will have them? As technology continues to accelerate and machines begin taking care of themselves, fewer people will be necessary. Artificial intelligence… more

[ TIPS & TRICKS OF THE WEEK]

Data aids, not replaces, judgement
Data is a tool and a means to help build consensus and facilitate human decision-making, not replace it. Analysis converts data into information; information, seen in context, leads to insight; insight leads to decisions, which ultimately lead to outcomes that bring value. So data is just the start: context and intuition still play a role.

[ DATA SCIENCE Q&A]

Q: Compare R and Python.
A: R
– Focuses on user-friendly data analysis, statistics and graphical models
– The closer you are to statistics, data science and research, the more you might prefer R
– Statistical models can be written in only a few lines of R
– The same piece of functionality can be written in several ways in R
– Mainly used for standalone computing or analysis on individual servers
– A large number of packages, for almost anything

Python
– Used by programmers who want to delve into data science
– The closer you work to an engineering environment, the more you might prefer Python
– Coding and debugging are easier, mainly because of the clean syntax
– Any piece of functionality tends to be written the same way in Python
– Preferred when data analysis needs to be integrated with web apps
– A good tool for implementing algorithms for production use (a short example follows below)

Source
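As a hedged illustration of the “few lines” point on the Python side (synthetic data, illustrative variable names), a linear model fit with statsmodels using an R-style formula:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic data set.
rng = np.random.default_rng(1)
df = pd.DataFrame({"x1": rng.normal(size=200), "x2": rng.normal(size=200)})
df["y"] = 2.0 * df["x1"] - 1.5 * df["x2"] + rng.normal(scale=0.5, size=200)

# An R-style formula interface: the model fit itself is one line.
model = smf.ols("y ~ x1 + x2", data=df).fit()
print(model.summary())
```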

[ VIDEO OF THE WEEK]

Decision-Making: The Last Mile of Analytics and Visualization


Subscribe to YouTube

[ QUOTE OF THE WEEK]

Without big data, you are blind and deaf and in the middle of a freeway. – Geoffrey Moore

[ PODCAST OF THE WEEK]

#BigData @AnalyticsWeek #FutureOfData #Podcast with Dr. Nipa Basu, @DnBUS


Subscribe 

iTunes | Google Play

[ FACT OF THE WEEK]

Facebook stores, accesses, and analyzes 30+ Petabytes of user generated data.

Sourced from: Analytics.CLUB #WEB Newsletter

Slow progress forces Navy to change strategies for cloud, data centers

The Department of the Navy isn’t making as much progress on data center consolidation and moving to the cloud as it wants to. So the Navy is moving the initiatives under a new owner and coming down hard on those who are standing in the way.

“Later this year, we will make an organizational change to our approach to data center consolidation. The Data Center and Application Optimization (DCAO) program office will move from under Space and Naval Warfare Systems Command (SPAWAR) headquarters to under Program Executive Office-Enterprise Information Systems (PEO-EIS) as a separate entity or program office,” said John Zangardi, the Navy’s deputy assistant secretary for command, control, computers, intelligence, information operations and space and acting chief information officer. “This will better align consolidation efforts with network efforts and more fully leverage the Next Generation Enterprise Network (NGEN) contract.

“So we will build on their application experience. The DCAO will be responsible for establishing a working model for Navy cloud hosting service brokerage. This will be for the delivery of application hosting via commercial and federal agencies. Culturally, we have to make this shift from a mistaken belief that all our data has to be near us, somewhere where I can go and hug the server, instead of someplace in the cloud that I don’t know. This is a big shift for many within the department. It’s not going to be an easy transition.”

Since 2012, the Navy has made some progress. Zangardi, who spoke at the 14th annual Naval IT Day sponsored by AFCEA’s Northern Virginia chapter, said over the last three years, the Navy has consolidated 290 systems and apps across 45 sites. But overall, he said getting bases and commands to move faster just isn’t happening.

The Navy plans to officially move the data center consolidation office into the PEO-EIS office in July.

Testing the cloud access point

Knowing the difficulties and challenges over the past few years, Zangardi said he’s taking several steps to help ease the pain.

First, he said his office picked three data centers that are lagging behind and required them to develop a plan to consolidate and move their data to a centralized data center.

Second, the Navy is rationalizing large scale apps. Zangardi said too often people hold their applications and servers close.

“I spend a lot of time thinking about the cloud access point (CAP) and our data centers. My objective is to move stuff as quickly as possible. The applications we are looking at right now to move to our cloud access point, the ones I’m most interested in moving right now, would come out of the N4 world, so we are talking about things like maintenance or aviation type of stuff so think logistics,” he said. “We’re also looking at enterprise resource planning (ERP). Can we move our ERP to a cloud type of solution to drive in more efficiencies? I think most of the things we are looking at, at least upfront, would be business sort of applications.”

The third way to ease the pain is by using pilot programs to get commands and bases comfortable with the idea of letting go of their servers and data.

“PEO-EIS and SPAWAR Systems Center Atlantic are piloting a cloud access point in conjunction with the commercial cloud service provider. It’s currently operating under an interim authority to test,” Zangardi said. “These organizations have the right expertise to develop the approach for the department to leverage the cloud. However, the CAP pilot is in its early stages. Essentially right now we are doing table top testing.

“Our objective over the next year is to move from a pilot effort to what I would term a productionized commercial cloud. What do I mean by productionized? Simply to me it means an industry leveraged approach that can scale to demand from users. This capability should be secure, provide lower costs for storage and data and facilitate mobility.”

One big question about this consolidation effort is how to break out the 17 or 19 data centers that fall under the NGEN intranet and put them under the PEO-EIS team with the other data centers.

Zangardi said the Navy is considering an approach to this, but it’s still in the early stages.

Private cloud works just fine

While the Navy is open to using commercial or public clouds, the Marine Corps is going its own way.

Several Marine Corps IT executives seemed to signal that the organization will follow closely what the Navy is doing, but put its own twist on the initiative.

One often-discussed example of this is the Marines’ decision not to move to the Joint Regional Security Stacks (JRSS), which are part of the Joint Information Environment (JIE), until at least version 2 comes online in 2017. Marine Corps CIO Gen. Kevin Nally said the decision not to use the initial versions of JRSS is because the Marine Corps’ current security setup is better and cheaper than version 1 or 1.5.

See more at: http://www.federalnewsradio.com/412/3857833/Slow-progress-forces-Navy-to-change-strategies-for-cloud-data-centers

Originally Posted at: Slow progress forces Navy to change strategies for cloud, data centers