
Lead Data Engineer

Humanify360

This is a Contract position in Hamilton, ON posted June 4, 2021.

This is a remote position.

What we do

As a critical part of Quebec's electrical infrastructure, our mission is to build a highly resilient and scalable IoT platform that integrates a wide range of edge devices.

We develop micro-services, mobile apps and hardware, ranging from a centralized cloud energy-demand-management system that broadcasts orders to edge devices through a hardware gateway in customers' homes, to systems handling OTA updates and commands to edge devices via Zigbee, and everything in between, from mobile apps to web management consoles.

We are a dynamic team of electrical engineers, embedded software developers, cloud developers, operations specialists, QA specialists, product owners, designers (and more) with a DevOps and TDD mentality.

Sound like fun? Join us to change the world's energy landscape!

About the role

This role reports to the Vice-President, Technology.

Working with the VP Tech, QA and the Coach & Machine Learning product group, construct the roadmap for the next 3-6 months.

Participate in building out and managing the Data engineering team.

Ensure the product vision is implemented cleanly and precisely while balancing it with minimizing technical debt and sound architectural decisions.

Be the go-to technical person for the full Data engineering product stack, including API interfaces, tooling and all other required developments.

Ensure that scalability, security, maintainability and stability concerns are solidly reflected in code.

Foster an open and collaborative environment.

Lead collaborative teams to achieve common goals.

Push for TDD, SCRUM, SOLID principles, DevOps, GitFlow and CI/CD.

Be a champion for DevOps principles while understanding that Agile doesn't mean you can skip planning and that planning and Agile can coexist in harmony.

Act as the technical escalation point and reference for the software team.

Provide day-to-day engaged technical leadership, including driving architecture, design, code reviews, documentation, branching strategies and technology selection.

Bring an entrepreneurial mindset, openness, transparency, and collegiality to your everyday work.

Possess a craftsman's pride in the code the team puts out.

Prioritize and value quality over quantity without being a zealot or perfectionist, understanding that ultimately the code must meet the needs of the business.

Communicate excellently with peers and stakeholders; be transparent and data-driven, and know how to give good news and bad news, and how to listen.

Must have a great attitude and be an unflappable team member known for the ability to embrace a challenge while keeping the presence of mind to have fun along the way.

Must like the challenge of simplifying complex systems and always consider the big picture when acting locally.

What you'll do

You will be responsible for leading the team designing the data infrastructure supporting our Machine Learning, Analytics and BI use cases.

You will interact with other teams dealing with operational data for our smart home, smart building, virtual power plant and time series storage to determine the best way to zone, store, clean, process and augment it for our various use cases.

You will also be responsible for tracking and respecting budgets for the cloud ecosystems you develop.

Profile

What we'd like to see

Design and deploy cloud ETL/ELT solutions.

Hands-on experience building, operating, and monitoring a data analytics platform in a production environment.

Optimize existing pipelines and maintain all domain-related data pipelines.

Securely source external data for data-enrichment purposes.

Build data expertise and own data quality.

Production-level coding experience and strong knowledge of Python, SQL and REST APIs.

Experience designing and implementing data and machine learning architectures based on streaming data technologies for low-latency data processing (Apache Storm, Apache Spark/Flink, Apache Kafka, RabbitMQ, the Hadoop ecosystem, Azure Stream Analytics, Amazon Kinesis, etc.).

Experience building and maintaining ETL pipelines, data warehousing and dimensional modeling with relational and MPP/distributed systems.
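For flavour, here is a minimal sketch of what the streaming side can look like, assuming PySpark with the spark-sql-kafka connector; the topic name, event schema and storage paths are invented for illustration and are not part of our stack:

    # Minimal sketch: consume a (hypothetical) Kafka topic of demand events
    # with Spark Structured Streaming and land them in a data lake as Parquet.
    # Requires the spark-sql-kafka connector on the Spark classpath.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col, from_json
    from pyspark.sql.types import (StructType, StructField, StringType,
                                   DoubleType, TimestampType)

    spark = SparkSession.builder.appName("demand-events").getOrCreate()

    # Illustrative event schema; real payloads would differ.
    schema = StructType([
        StructField("device_id", StringType()),
        StructField("kw", DoubleType()),
        StructField("ts", TimestampType()),
    ])

    raw = (spark.readStream
           .format("kafka")
           .option("kafka.bootstrap.servers", "localhost:9092")
           .option("subscribe", "demand-events")   # hypothetical topic
           .load())

    # Kafka values arrive as bytes; parse the JSON payload into columns.
    events = (raw.select(from_json(col("value").cast("string"), schema).alias("e"))
              .select("e.*"))

    (events.writeStream
     .format("parquet")
     .option("path", "/data/lake/demand_events")                  # hypothetical path
     .option("checkpointLocation", "/data/checkpoints/demand_events")
     .outputMode("append")
     .start()
     .awaitTermination())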

Experience building offline data-processing architectures to extract inferences, correlations, outliers and other mathematical objects from data in a cost-efficient manner.
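As a toy illustration of that kind of offline processing, assuming pandas and invented column names:

    # Offline batch sketch: simple correlations and z-score outliers with pandas.
    import pandas as pd

    # Hypothetical batch of meter readings landed by the streaming job above.
    df = pd.read_parquet("/data/lake/demand_events")

    # Pairwise correlation between two illustrative numeric columns.
    corr = df[["kw", "temperature_c"]].corr()

    # Flag readings more than three standard deviations from the mean.
    z = (df["kw"] - df["kw"].mean()) / df["kw"].std()
    outliers = df[z.abs() > 3]

    print(corr)
    print(len(outliers), "outlier rows")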

Good knowledge of purpose-built databases and languages, and their optimization: relational (e.g. Oracle PL/SQL, Microsoft T-SQL, PostgreSQL), columnar (e.g. AWS Redshift), in-memory (e.g. Redis), key-value (e.g. Elasticsearch, Apache Cassandra, CosmosDB) and graph (e.g. Gremlin, Neo4j, SPARQL).

Experience working with and designing complex data schemas.

Experience with SQL query performance optimization.

Experience with Spark performance optimization and troubleshooting.

Experience with event-driven architectures and message processing with message brokers such as Kafka.
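And a minimal consumer sketch for the event-driven side, assuming the kafka-python client; the broker address, topic and payload fields are hypothetical:

    # Minimal event-consumer sketch using the kafka-python client.
    import json
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "device-commands",                      # hypothetical topic
        bootstrap_servers="localhost:9092",     # hypothetical broker
        group_id="command-processor",
        auto_offset_reset="earliest",
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )

    for message in consumer:
        event = message.value
        # Process each event, e.g. forward a command to an edge-device gateway.
        print(event.get("device_id"), event.get("action"))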

Implemented one or more of Redshift, Snowflake, Azure Data Warehouse, ADLS, S3, Kafka, Presto, EMR, Databricks or a Data Lake architecture in one or more public clouds.

Developed Big Data/Hadoop applications using Spark, Hive, HBase, Presto or MapReduce.

Experience using CI/CD methodologies within the data space.

Things that help
– A university diploma in Computer Science, Mathematics, Software Engineering or any relevant discipline
– Familiarity with distributed architecture
– Knowledge of API development is a plus (REST, Swagger, OpenAPI, etc.)
– Knowledge of the Atlassian suite, Visual Studio, collaborative tools, Git, Sonar or other quality tools
– Good knowledge of the SDLC
– Knowledge of Terraform
– Experience with micro-services, their database implications and how to cleanly decouple services
– Knowledge of performance and profiling tools (profilers, Retrace/New Relic/Stackify, Application Insights, Azure Monitor); a small profiling sketch follows
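For illustration, profiling a hot path with nothing but the Python standard library (the function here is a stand-in, not real pipeline code):

    # Standard-library profiling sketch with cProfile.
    import cProfile
    import pstats

    def hot_path():
        # Stand-in for a pipeline stage worth profiling.
        return sum(i * i for i in range(1_000_000))

    profiler = cProfile.Profile()
    profiler.enable()
    hot_path()
    profiler.disable()

    # Print the ten entries with the highest cumulative time.
    pstats.Stats(profiler).sort_stats("cumulative").print_stats(10)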