Become a Quickplayer today and grow with us!
QuickPlay, now part of AT&T, is the premier provider of solutions to manage the business of mobile video. Recognized internationally for innovation, flexibility, and excellence, QuickPlay provides the fastest and most flexible way for companies to deliver mobile video worth watching. Today, QuickPlay powers high-quality TV, video, audio, and radio services for 14 mobile operators and 150 content providers worldwide.
In an evolving market, our employees are inspired by an innovative environment that allows them to lead, motivate, and create while reaching their full potential and achieving great results. Spending their days in a challenging atmosphere developing cutting-edge products for the biggest names in media and communications, QuickPlay employees are able to expand their skills and grow with a passionate and talented group of people.
Read about some of the things we’re doing here: http://www.quickplay.com
This is a highly technical Big Data architect position responsible for realizing our Big Data strategy through strong architecture, design, and implementation skills. The architect will be a key thought leader in many Big Data disciplines, such as real-time streaming and processing as well as offline analytical batch processing.
Roles and responsibilities
- Provide technical leadership in the architecture, design, and delivery of key software systems and features
- Oversee the design and architecture of Big Data applications
- Design and develop components identified in our platform product [DAP]
- Size and estimate optimal Big Data infrastructure to provide high performance data pipelines while maintaining cost efficiency
- Guide development teams in the Big Data architectural frameworks and best practices
- Oversee the deployment and sustainment of extensive Big Data pipelines and interfaces in a highly replicated, multi-location environment
- Utilize Java, distributed systems, and Big Data technologies to develop high-performance, scalable systems and data pipelines for use within AEG's web services systems
- Research new technologies and approaches for presenting key business insights by analyzing Big Data
- Proactively drive new approaches and techniques for the Big Data platform as well as propose improvements in software development practices, software quality, and software efficiencies
- Test and prove concepts through rapid prototyping
- Define our data governance strategy
Key competencies and skills
- Strong experience with Hadoop, HDFS, Kafka, Flume and related big data systems, particularly with the Hortonworks Data Platform
- Extensive programming experience in Java, R, and Scala
- Advanced skills using one or more scripting languages (e.g. Python, UNIX shell scripts)
- Extensive background developing on Linux systems
- Expert knowledge of agile development methodology
- Proven ability to lead both local and offsite teams
- Extensive experience with cloud-hosted applications (e.g. Rackspace, AWS, Azure, HP)
- Experience with integrating open-source technology into large enterprise platforms
Education and qualifications
- 5-10 years of Big Data experience and a bachelor's degree in Computer Science or Engineering
- Periodic travel may be required