Summary Sheet: I.T. & Communications
| | | | |
|---|---|---|---|
|Advertiser Name:|Request Technology - Anthony Honquest|Advertiser Type:|Agency|
|Classification:|I.T. & Communications|Subclassification:| |
|Country:|United States|Location:|United States|
|Language:|English - United Kingdom (en-GB)|Contact Name:|Anthony Honquest|
|Employment Type:|Permanent|Workhours:|Full Time|
Position: Big Data Engineer
Big Data Engineering combines DevOps techniques from many disciplines - including mathematics/statistics, computer programming, data engineering and ETL, software development, and high-performance computing - with traditional business expertise, with the goal of extracting meaning from data to optimize future business decisions. Individuals in this field should be lifelong learners, eager to become experts in several of these disciplines and sufficiently proficient in the others to effectively design, build, and deliver analytics products that optimize future decisions.
The Big Data Engineer job family is accountable for DevOps engineering of data solutions, which includes designing and building systems for data storage and analytics that enable better decisions in support of the Company's goals.
The individual must demonstrate sufficient adaptability to quickly develop new skills across these disciplines as they evolve, and to assist in the selection and development of other team members.
This role is responsible for a DevOps approach to the development of new systems for analysing data; the coding and development of advanced analytics solutions that make and optimize business decisions and processes; the integration of new tools to improve analytics; and the resolution of new technical challenges using existing and emerging technology solutions.
- Executes complex functional work tracks and drives the execution of operational/technical objectives for data analytic outputs and business solutions.
- Partners with other internal teams and peers in the department to ensure holistic Big Data solutions meet the needs of various stakeholders.
- With coaching, identifies new areas of data, research, and Big Data technology that can solve business problems.
- Leverages Big Data best practices to develop technical solutions used for analytical insights.
- Acts as an influencer within the department on the effectiveness of Big Data solutions in solving business problems.
- Supports innovation; regularly provides new ideas to improve the people, processes, and technology that interact with the analytics ecosystem.
- With coaching, develops and builds frameworks/prototypes that integrate big data and advanced analytics to make better business decisions.
- Executes on Big Data requests to improve the accuracy, security, quality, completeness, and speed of data and of the decisions made from Big Data analysis.
- Uses, learns, teaches, and supports a wide variety of Big Data and Data Science tools to achieve results (e.g. R, ETL tools, Hadoop, and others).
- Uses, learns, teaches, and supports a wide variety of programming languages for Big Data and Data Science work (e.g. Java, C#, Python, and Perl).
- Supports a clear communication strategy that keeps all relevant stakeholders informed and gives them an opportunity to influence the direction of the work.
- Trains and develops other engineers.
Qualifications:
- 2-5 years of experience as a Big Data Engineer.
- Bachelor's Degree in Computer Science, MIS, or related area, or equivalent work experience. Master's Degree in a quantitative or scientific field would be a plus.
- Experience using software development to drive data science and analytics efforts.
- Experience with database integration, dataflow management, and ETL technologies.
- Experience with various data types (e.g. relational, unstructured, hierarchical, linked graph data).
- Experience developing, managing, and manipulating large, complex datasets.
- Understanding of security risks and vulnerabilities pertaining to open source systems, leveraging tools and techniques to minimize risk; where appropriate, provides recommendations and justifications that ensure speed of access while minimizing risk for scientists and developers.
- Experience with, and a solid understanding of, Big Data ecosystems such as Hadoop, Spark, Streaming, and Kafka.
- Ability to code and develop prototypes in languages such as Python, Scala, Java, C, R, and SQL.
- Ability to communicate and present advanced technical topics to general audiences including teams across multiple time zones.
- Experience leading project teams of various skill levels.
- Understanding of predictive modelling techniques is a plus.
- Automation, configuration management (e.g. Ansible, Puppet), DevOps practices, and CI/CD pipelines (e.g. Jenkins).
- Elementary networking skills: switching, routing, firewalls, and load balancing.
- Linux Containers/Docker.