Job Description:
Are you looking for unlimited opportunities to develop and succeed? With work that challenges and makes a difference, within a flexible and supportive environment, we can help our customers achieve their dreams and aspirations.
Are you interested in working for a large multinational that puts people first and where showing your humanity is truly valued?
Manulife has a strong culture of transparent communication and teamwork, with empowerment and working flexibility embedded in our culture.
We fully embrace open-source and AI technology. Manulife believes great ideas can come from anywhere and anyone; we are looking for talented technologists who want to use their passion and interests to work with other passionate and diverse individuals to help us become a digital customer leader.
Our employees are able to develop their careers and skill sets to executive levels without stepping away from the frontline of technological innovation, through our IT Careers Framework, Manulife University, and Pluralsight, our technology skills platform.
Manulife is also committed to adopting Agile as a mindset across our businesses. If new ways of working, beer-and-pizza-fueled hackathons, an innovative approach to problem solving, change, process and system improvement, and continuous learning excite you, then please do get in touch.
It's also important to note that we welcome IT talent from non-IT backgrounds; even a degree is not a must-have. If you're excited by the proposition and you think you've got what it takes, follow the link!
Designs and ensures consistent data flow through the organization; works closely with data scientists and data engineers on moving data analysis models into production
Assesses middleware tools for data integration, transformation and routing
Maintains data warehouse performance by identifying and resolving conflicts
Utilizes a variety of data interchange formats to ensure that data requirements for the business are met
Monitors mapping and data integrity across the organization
Designs and promotes the effective use of data-querying APIs that give data consumers easy access to organizational data resources
Deploys machine learning models in production
Understanding of relational and warehousing database technology, working with at least one of the major database platforms (e.g., Oracle, SQL Server, Teradata, MySQL, or Postgres)
Practical experience with big data processing frameworks and techniques such as HDFS, MapReduce, Storage formats (Avro, Parquet), Stream processing, etc.
ETL experience with tools such as Informatica
Solid working knowledge of data processing tools using SQL, Spark, Python or similar open source and commercial technologies
Knowledge of financial services business requirements and needs
Knowledge of IT industry trends, specifically those related to data; understanding of data management, from RDBMS to NoSQL data models
Experience in the areas of master data management, data warehousing, and analytics is desired. Any experience with Big Data will be an asset.
Knowledge of Java / Scala, especially in relation to big data open-source software, and knowledge of NiFi preferred
Knowledge of non-relational (Cassandra, MongoDB) databases preferred
Predictive analytics and machine learning experience (scikit-learn, TensorFlow, MLlib, recommendation systems) preferred
Experience with integrating to back-end / legacy environments
Experience with industries such as financial institutions, insurance, high tech, and retail / CPG
Knowledge of and familiarity with machine learning model application and production pipelines
Collaborative attitude, willingness to work with team members; able to coach, participate in code reviews, share skills and methods
Constantly learns from both success and failure
Good organizational and problem-solving abilities that enable you to manage through creative abrasion
Good verbal and written communication; able to effectively articulate technical vision, possibilities, and outcomes
Experiments with emerging technologies and understands how they will impact what comes next
BSc in Computer Science, Statistics, Informatics, Information System, Mathematics or equivalent quantitative field preferred
Experience in using one or more DBMSs: Oracle, PostgreSQL, MySQL, Greenplum, HP Vertica
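To give a small flavor of the SQL-and-Python data processing this role involves, here is a minimal sketch using only Python's built-in sqlite3 module. The table and column names are purely illustrative and are not taken from any real Manulife system:

```python
import sqlite3

# Minimal ETL-style example: load raw records, then aggregate with SQL.
# "policies" and its columns are hypothetical names for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE policies (region TEXT, premium REAL)")
raw_rows = [("APAC", 120.0), ("APAC", 80.0), ("EMEA", 200.0)]
conn.executemany("INSERT INTO policies VALUES (?, ?)", raw_rows)

# Aggregate premiums per region -- the kind of query a data engineer
# might run against a warehouse, here scaled down to an in-memory table.
totals = dict(
    conn.execute("SELECT region, SUM(premium) FROM policies GROUP BY region")
)
print(totals)  # {'APAC': 200.0, 'EMEA': 200.0}
```

In practice the same pattern scales up to Spark or a warehouse platform; the day-to-day thinking (load, transform, aggregate) is the same.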