Big Data Developer


Summary Sheet: I.T. & Communications

    
Advertiser Name: Mphasis
Advertiser Type: Agency
Classification: I.T. & Communications
Subclassification:
Country: Canada
Location: Canada
Language: English - United Kingdom (en-GB)
Contact Name:
Employment Type: Permanent
Work Hours: Full Time



Position: Big Data Developer


Description:


Location: Calgary, AB


Experience: 7-10 Years


No. of Positions: 5


What will you do?


- Automate, manage, and evolve key business data pipelines (batch and streaming) to deliver value to business users and stakeholders


- Work with cloud technologies in AWS to implement data platform functionality and toolset (S3, EC2, Lambda, ECS, EMR, Glue, SageMaker, Redshift)


- Work with big data technologies to support the needs of the data teams (Spark, Hadoop, Hive, Neo4j and Kafka)


- Manage deployments via automated infrastructure-as-code pipelines and container-based technologies (such as Docker, ECS, and Kubernetes)


- Oversee our developed solutions for batch and real-time data and analytics use cases


- Collaborate with data architects to determine which data management systems to use, data scientists to identify data for analysis, and IT team members on project goals


- Work closely with development teams to learn about requirements, workflows, and processes, and to promote best practices across the Enterprise


- Continually enhance skills and build knowledge in all aspects of the organization, the business, and information systems


- Work closely with stakeholders to ensure successful data asset design and development


- Join data across multiple data environments, such as HDFS, S3, and Data Warehouses, using complex optimized queries
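
For illustration, a minimal PySpark sketch of the cross-environment join described in the last item, assuming hypothetical S3/HDFS paths, a shared customer_id key, and a customer_segment column; this is a sketch of the technique, not this team's actual pipeline.

    from pyspark.sql import SparkSession

    # Hypothetical example: join event data in S3 with reference data in HDFS.
    # Paths, bucket names, and column names are placeholders; reading s3a://
    # paths assumes the cluster is configured with the hadoop-aws connector.
    spark = SparkSession.builder.appName("cross-environment-join").getOrCreate()

    events = spark.read.parquet("s3a://example-bucket/events/")        # batch events in S3
    customers = spark.read.parquet("hdfs:///warehouse/customers/")     # reference data in HDFS

    # Join on a shared key and aggregate per segment
    segment_counts = (
        events.join(customers, on="customer_id", how="inner")
              .groupBy("customer_segment")
              .count()
    )

    segment_counts.write.mode("overwrite").parquet("s3a://example-bucket/reports/segment_counts/")

Spark resolves both s3a:// and hdfs:// URIs through the Hadoop filesystem layer, which is what lets a join across the two environments run as a single optimized query.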


Must-have


- Experience with AWS CloudFormation and automation of services


- Bachelor's degree (BSc) or above in computer science, software engineering, computer engineering, or a related field


- 5+ years of cloud experience across the big data ecosystem, including Hadoop (Pig, Hive, HDFS), Apache Spark, Java, Python, and NoSQL/SQL databases


- Experience working with cloud platforms such as AWS, Azure, or Google Cloud, including big data pipelines and data management


- Experience with data security for data at rest and in transit (firewalls, hashing, encryption, SSL)


- Experience collaborating within a software development team using standard DevOps tools and CI/CD practices


- Experience building ETL big data pipelines (e.g., Apache Airflow), plus knowledge of CI workflows and build/test automation (a minimal Airflow sketch follows this list)


- Good knowledge of popular data standards and formats (JSON, XML, Parquet, Avro, etc.)


- Ability to work on a cross-functional Agile team responsible for end-to-end delivery of business needs


- Ability to help improve data management processes: acquiring, transforming, and storing massive volumes of structured and unstructured data
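
As a rough illustration of the Airflow-based ETL pipelines called out above, here is a minimal DAG sketch, assuming a recent Apache Airflow 2.x release; the DAG id, schedule, and task bodies are placeholders, not a prescribed implementation.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    # Placeholder callables standing in for real extract/transform/load logic.
    def extract():
        print("pull raw data from source systems")

    def transform():
        print("clean and reshape the extracted data")

    def load():
        print("write curated data to the warehouse")

    # Hypothetical daily ETL pipeline with a simple linear dependency chain.
    with DAG(
        dag_id="example_etl_pipeline",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        transform_task = PythonOperator(task_id="transform", python_callable=transform)
        load_task = PythonOperator(task_id="load", python_callable=load)

        extract_task >> transform_task >> load_task

In practice, each task would call into Spark, Glue, or warehouse-load logic rather than print statements; the DAG only expresses scheduling and dependencies.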


Nice-to-have


- Experience with other analytics programming languages (Python and R)


- Experience with other data analytics and visualization tools such as Tableau


- Experience with Agile software development


- Knowledge of IT Security issues and risks related to cloud services


About Mphasis


Mphasis is a leading IT solutions provider, offering Applications, Business Process Outsourcing (BPO), and Infrastructure services globally through a combination of technology know-how, domain, and process expertise.


Job Type: Full-time


Schedule:
8-hour shift


Work Location: One location



