The Fifth Elephant 2015
A conference on data, machine learning, and distributed and parallel computing
Jul 16–18, 2015
Thu, Jul 16: 08:30 AM – 06:35 PM IST
Fri, Jul 17: 08:30 AM – 06:30 PM IST
Sat, Jul 18: 09:00 AM – 06:30 PM IST
Machine Learning, Distributed and Parallel Computing, and High-performance Computing are the themes for this year’s edition of The Fifth Elephant.
The deadline for submitting a proposal is 15th June 2015.
We are looking for talks and workshops from academics and practitioners who are in the business of making sense of data, big and small.
This track is about general, novel, fundamental, and advanced techniques for making sense of data and driving decisions from it. This could encompass applications of ML paradigms across various data modalities, including multi-variate, text, speech, time series, images, video, and transactions.
This track is about tools and processes for collecting, indexing, and processing vast amounts of data.
HasGeek believes in open source as the binding force of our community. If you are describing a codebase for developers to work with, we’d like it to be available under a permissive open source license. If your software is commercially licensed or available under a combination of commercial and restrictive open source licenses (such as the various forms of the GPL), please consider picking up a sponsorship. We recognize that there are valid reasons for commercial licensing, but ask that you support us in return for giving you an audience. Your session will be marked on the schedule as a sponsored session.
If you are interested in conducting a hands-on session on any of the topics falling under the themes of the two tracks described above, please submit a proposal under the workshops section. We also need you to tell us about your past experience in teaching and/or conducting workshops.
Manoj Sundaram
@manojsundaram
Submitted May 27, 2015
Hadoop was originally developed for crawling and indexing the Internet, where security was not a concern. But we have come a long way since then. Major banks and organizations are adopting Hadoop as their preferred Big Data platform, and there is a growing emphasis on securing the data and the cluster’s components and resources. In a complicated, distributed system like Hadoop, there are several attack vectors that need to be mitigated, and companies need to comply with standards like PCI, SOX, HIPAA, and so on.
This session will help the audience understand the different levels at which security can be enforced to truly protect your data and your Hadoop cluster.
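As a taste of one such level, consider authentication. The sketch below assumes a cluster where a Kerberos KDC and service principals are already in place; with that done, turning on strong authentication and service-level authorization comes down to two properties in core-site.xml:

<!-- core-site.xml: minimal sketch; assumes a KDC and Kerberos principals already exist -->
<property>
  <name>hadoop.security.authentication</name>
  <!-- the default is "simple", which trusts whatever username the client claims -->
  <value>kerberos</value>
</property>
<property>
  <name>hadoop.security.authorization</name>
  <!-- enforce the service-level ACLs defined in hadoop-policy.xml -->
  <value>true</value>
</property>

This covers only the authentication and service-authorization layer; wire encryption, encryption at rest, and fine-grained data authorization are separate controls layered on top.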
Data breaches are costly to any organization, and the number of incidents keeps going up every year. For a visual overview of some of the largest data breaches in history, see the links below:
http://www.informationisbeautiful.net/visualizations/worlds-biggest-data-breaches-hacks/
http://www.networkworld.com/article/2861023/security0/worst-security-breaches-of-the-year-2014-sony-tops-the-list.html
Okay, we get it. Security threats are everywhere and can occur at any time. But how do we mitigate them? That is what I will be covering in my talk.
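To make mitigation concrete, here is a small sketch of two data-level controls available in the Hadoop stack; the paths and key name are hypothetical, and the encryption-zone step assumes a running Hadoop KMS:

# Grant one extra user read access beyond the POSIX owner/group/other bits
# (requires dfs.namenode.acls.enabled=true on the NameNode)
hdfs dfs -setfacl -m user:auditor:r-x /data/cardholder
hdfs dfs -getfacl /data/cardholder

# Encrypt data at rest: create a key in the KMS, then mark an empty
# directory as an encryption zone so files written under it are encrypted
hadoop key create pci-key
hdfs dfs -mkdir /secure
hdfs crypto -createZone -keyName pci-key -path /secure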
High-level overview of topics to be covered:
Manoj leads the APAC team in Intel’s Big Data Solution Enablement organization. He works with customers daily, helping them design, plan, and deploy Hadoop clusters, secure the platform, troubleshoot issues, and tune performance.
He is an expert in Hadoop operations, operating systems, networking, and security administration. He is also a Red Hat Certified Architect (RHCA), a Red Hat Certified Security Specialist (RHCSS), a Red Hat Certified Datacenter Specialist (RHCDS), and a Cloudera Certified Administrator for Apache Hadoop (CCAH).
More details @ https://in.linkedin.com/in/manojsundaram