Big Data Hadoop Consultant
The digital revolution is changing everything. It’s everywhere – transforming how we work and play. Are you reacting to the disruption each day or are you leading the way as a digital disrupter? Accenture Digital is driving these exciting changes and bringing them to life across 40 industries in more than 120 countries. At the forefront of digital, you’ll create it, own it and make it a reality for clients looking to better serve their connected customers and operate always-on enterprises. Join us and become an integral part of our experienced digital team with the credibility, expertise and insight clients depend on.
Accenture Digital is powered by three practices – Mobility, Interactive, and Analytics. As part of our Analytics practice, you’ll deliver analytically informed, issue-based solutions that help clients make faster, smarter decisions. You’ll play a critical role in helping them tackle complex business issues.
Do you have your finger on the pulse of new technologies and a desire to change the way business gets done? Do you want to implement emerging solutions for some of the most successful companies around? If you answered yes to these questions and you are passionate about helping clients effectively manage enormous amounts of data to generate knowledge and value, then we want to meet you.
YOUR ROLE: Analytics Delivery – Big Data Consultant
There will never be a typical day at Accenture Analytics, but that’s why people love it here. The opportunities to make a difference within exciting client initiatives are unlimited in the ever-changing digital landscape. Here are just a few of your day-to-day responsibilities.
· Deliver large-scale programs that integrate processes with technology to help clients achieve high performance.
· Design, implement and deploy custom applications on Hadoop.
· Implement complete Big Data solutions, including data acquisition, storage, transformation, and analysis.
· Design, implement and deploy ETL to load data into Hadoop (see the sketch after this list).
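By way of illustration only, and not as a prescribed Accenture toolset, the sketch below shows one common shape of such an ETL job using PySpark. The application name, file paths, and column names are hypothetical placeholders.

```python
# Minimal PySpark ETL sketch: acquire raw CSV files from HDFS, apply a
# simple transformation, and store the result back in HDFS as Parquet.
# All paths and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Acquisition: read delimited source files already landed in HDFS.
raw = spark.read.csv("hdfs:///data/raw/transactions",
                     header=True, inferSchema=True)

# Transformation: cast a column to a numeric type and drop bad records.
clean = (raw
         .withColumn("amount", F.col("amount").cast("double"))
         .filter(F.col("amount").isNotNull()))

# Storage: write a columnar copy for downstream Hive or Spark analysis.
clean.write.mode("overwrite").parquet("hdfs:///data/curated/transactions")

spark.stop()
```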
YOUR EXPERIENCE: Basic Qualifications
Minimum 1 year of experience building and deploying Java applications
Minimum 1 year of experience building and coding applications using at least two Hadoop components, such as MapReduce, HDFS, HBase, Pig, Hive, Spark, Sqoop, Flume, etc.
Minimum 1 year of coding experience in at least one of the following: Python, Pig, Hadoop Streaming, HiveQL (a short illustration follows this list)
Minimum 1 year of experience implementing relational data models
Minimum 1 year of experience with traditional ETL tools and RDBMSs
Minimum of a Bachelor’s Degree or 3 years of IT/programming experience
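As a concrete illustration of the Python and Hadoop Streaming skills above, a classic streaming job pairs a mapper and a reducer that read stdin and write stdout. A minimal word-count sketch follows; the script names are illustrative.

```python
# mapper.py -- Hadoop Streaming mapper: emit (word, 1) for each word.
import sys

for line in sys.stdin:
    for word in line.split():
        print(f"{word}\t1")
```

```python
# reducer.py -- Hadoop Streaming reducer: sum counts per word.
# Hadoop sorts mapper output by key, so identical words arrive together.
import sys

current_word, count = None, 0
for line in sys.stdin:
    word, value = line.rstrip("\n").split("\t", 1)
    if word != current_word:
        if current_word is not None:
            print(f"{current_word}\t{count}")
        current_word, count = word, 0
    count += int(value)
if current_word is not None:
    print(f"{current_word}\t{count}")
```

Such scripts are typically submitted with the hadoop jar command against the distribution’s streaming jar, passing -mapper, -reducer, -input, and -output options; the exact jar path varies by Hadoop distribution.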
SET YOURSELF APART: Preferred Qualifications
• Full life-cycle development experience
• Minimum 1 year of experience developing REST web services
• Industry experience (financial services, resources, healthcare, government, products, communications, high tech)
• Experience leading teams
• Data Science and Analytics (machine learning, analytical models, Mahout, etc.)
• Data Visualization