Jul 2012
Fri 27: 09:30 AM – 05:30 PM IST
Sat 28: 09:30 AM – 05:00 PM IST
Kushwaha Manish Kaushal
This talk gives insights into the problems and solutions involved in working with very high volumes of data (~330 TB). The problems span hardware infrastructure as well as functional treatment, and the BigData problem grows further when your data collection is increasing by 10% per month. The solutions are built on the Hadoop eco-system.
We at Pubmatic are handling more than 330 TB of data using the Apache Hadoop eco-system, and have resolved many burning issues with Hadoop itself using available open-source tools. By combining many components of Hadoop, we have built our “On the fly Analytic” platform, which addresses a wide range of the analytics functional space.
I would like to cover how we efficiently tackle this huge data set at our company, at no software cost and on commodity servers; the day-to-day problems of running a big Hadoop cluster and generic solutions to those problems; and a few analytics use cases that require heavy data churning and joins between different data sets.
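To make the join use case above concrete, here is a minimal sketch of a reduce-side join as it is commonly expressed for Hadoop MapReduce (this is an illustration, not Pubmatic's actual implementation; the dataset names `publishers` and `impressions` are hypothetical). A mapper tags each record with its source dataset, the shuffle groups records by join key, and a reducer merges the two sides:

```python
# Sketch of a reduce-side join in the style of Hadoop MapReduce / Streaming.
# Dataset names ("publishers", "impressions") are hypothetical examples.
from itertools import groupby
from operator import itemgetter

def mapper(source, record):
    """Tag each (key, value) record with its source dataset.
    Emits (join_key, (source, value)) so the reducer can tell sides apart."""
    key, value = record
    return key, (source, value)

def reducer(tagged_records):
    """Simulate what reduce sees after the shuffle: records grouped by key.
    Emits the cross product of the two sides for each join key."""
    joined = []
    for key, group in groupby(sorted(tagged_records), key=itemgetter(0)):
        rows = list(group)
        left = [v for _, (s, v) in rows if s == "publishers"]
        right = [v for _, (s, v) in rows if s == "impressions"]
        for l in left:
            for r in right:
                joined.append((key, l, r))
    return joined

# Tiny in-memory stand-in for two HDFS datasets.
publishers = [("p1", "Acme Media"), ("p2", "Globex")]
impressions = [("p1", "ad-17"), ("p1", "ad-42"), ("p2", "ad-3")]

tagged = ([mapper("publishers", r) for r in publishers]
          + [mapper("impressions", r) for r in impressions])
print(reducer(tagged))
# → [('p1', 'Acme Media', 'ad-17'), ('p1', 'Acme Media', 'ad-42'),
#    ('p2', 'Globex', 'ad-3')]
```

At the 330 TB scale discussed in the talk, the same pattern runs distributed: the framework's shuffle replaces the in-memory `sorted`/`groupby`, and each reducer sees only the records for its partition of the key space.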
Manish Kaushal, Principal Architect, Pubmatic.
Handles the analytics initiative and the Hadoop eco-system.
Past:
Sr. R&D Engineer at Nokia Siemens Networks: handled AdServer projects that required large-scale data handling.
Sr. Lead Engineer at Motorola: handled various telecom and retail loyalty programs, all of which required handling BigData.