
  • Coventry
  • Salary negotiable
  • Temp
  • 10 Mar 2020

Job Title: Contract - Data Migration Engineer

Day Rate: £Negotiable (Inside IR35)

Team: Technical Operations

Office: Coventry

Duration: 6-12 Months

Shell Energy has partnered exclusively with Robert Walters to find an experienced Contract Data Migration Engineer in Coventry to support a migration project.

About Shell Energy

At Shell Energy, we provide more and cleaner energy solutions across a global portfolio of gas, power and environmental products to meet the current and future energy needs of our customers: energy producers, asset owners, traders, wholesalers and large commercial and industrial energy users.

Your part in our journey:

You will be working closely with our internal customers to ensure delivery of well-architected cloud services that meet fast-changing demands.

This role works on a greenfield migration project.

Skills

  • Expert-level experience with SQL Server
  • Cloud experience, preferably AWS, although experience with Azure or GCP is also welcome.
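
The kind of work involved can be sketched as a simple extract-transform-load step. The snippet below is purely illustrative and uses Python's built-in sqlite3 as a stand-in for SQL Server and a cloud target; the table and column names are hypothetical, not taken from the role description.

```python
import sqlite3

# Illustrative ETL step for a data migration. sqlite3 stands in for
# SQL Server / a cloud data store; schema and names are hypothetical.

def migrate_customers(source: sqlite3.Connection, target: sqlite3.Connection) -> int:
    """Copy customer rows, normalising email case, and return the row count."""
    target.execute(
        "CREATE TABLE IF NOT EXISTS customers (id INTEGER PRIMARY KEY, email TEXT)"
    )
    rows = source.execute("SELECT id, email FROM customers").fetchall()
    # Transform step: trim whitespace and lower-case emails before loading.
    cleaned = [(row_id, email.strip().lower()) for row_id, email in rows]
    target.executemany("INSERT INTO customers (id, email) VALUES (?, ?)", cleaned)
    target.commit()
    return len(cleaned)

if __name__ == "__main__":
    src = sqlite3.connect(":memory:")
    src.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT)")
    src.executemany(
        "INSERT INTO customers VALUES (?, ?)",
        [(1, " Alice@Example.com "), (2, "BOB@example.com")],
    )
    dst = sqlite3.connect(":memory:")
    print(migrate_customers(src, dst))  # 2
```

A real engagement would add validation, reconciliation counts and incremental loads on top of this basic pattern.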

Desirable

  • Experience with Big Data and Hadoop
  • Snowflake
  • Databricks
  • StreamSets
  • AWS certifications and references from work on AWS migration projects would be advantageous.

To find out more about the opportunity, or to apply, please contact Ash Ali at

All direct and third-party applications will be forwarded to our exclusive recruitment partner Robert Walters for their consideration.

  • Brighton
  • £400.00 - £550.00 per day
  • Contract
  • 10 Mar 2020
Data Scientist

Role Overview

Data scientists explore data using innovative statistical, analytical and machine learning approaches to provide actionable insights which will help create new, high-value customer propositions.

The data scientist will be responsible for creating and incrementally improving data products, focused on particular customer outcomes, during the early discovery and design phases. As a key member of an agile product team, the data scientist will collaborate closely with business domain experts to understand raw data and its business context, and then apply data science techniques to develop valuable, actionable insights, algorithms or models which will be tested with customers.

The data scientist will be comfortable applying data science techniques such as data mining, statistical analysis and cluster analysis to large data sets using cloud computing resources, Big Data frameworks and open-source packages. They will understand the theory behind a range of machine learning algorithms (e.g. regression, decision trees, k-means) and have practical experience building accurate predictive models with them.
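
To give a flavour of the cluster analysis mentioned above, here is a minimal one-dimensional k-means (Lloyd's algorithm) sketch in plain Python. In practice a library such as scikit-learn would be used on real, multi-dimensional data; the energy-use numbers below are made up for illustration.

```python
import random

# Minimal 1-D k-means sketch (Lloyd's algorithm); illustrative only.

def kmeans_1d(points, k, iterations=20, seed=0):
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iterations):
        # Assignment step: attach each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        # Update step: move each centroid to the mean of its cluster.
        centroids = [
            sum(c) / len(c) if c else centroids[i] for i, c in enumerate(clusters)
        ]
    return sorted(centroids)

if __name__ == "__main__":
    usage = [0.9, 1.1, 1.0, 5.2, 4.8, 5.0]  # e.g. daily energy use in kWh
    print(kmeans_1d(usage, k=2))  # two centroids, near 1.0 and 5.0
```

The two resulting centroids separate low-usage from high-usage days, which is the kind of segmentation that feeds customer propositions.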

The successful candidate will be able to communicate complex technical information effectively to non-technical stakeholders. Where necessary, the role includes on-the-job training and coaching from experienced data scientists in the team, and begins a development path towards becoming a future technical leader.

Responsibilities:

*Set up an exploratory environment and join multiple large data sets (weather, demographic, sensor) to enable an in-depth understanding of energy use across hundreds of thousands of homes
*Generate statistics, clusters and insights about the customer group to help direct future data-enabled products and services
*Help prioritise the products and services, and use data to support these decisions
*Build models, algorithms or novel approaches to automating data-enabled products and services for customers, and track the impact of these services to support iterative development
*Build relationships with the wider data science community in BlueLab and EDF Energy
*Support the industrialisation of models and algorithms from R&D

Principal Accountabilities:

*Apply scientific methods through experimental design, exploratory data analysis and hypothesis testing
*Lead on the technical solutions for proof-of-concept projects, experiments and trials using advanced analytics and machine learning
*Understand the business drivers of use cases and help to confirm the appropriateness of data science techniques in developing solutions
*Scrape, explore, cleanse and visualise big datasets using leading tools and technologies
*Engineer new features for models from raw data
*Demonstrate best practice programming and code optimisation using Python
*Develop predictive and prescriptive models which utilise supervised or unsupervised machine learning algorithms
*Deliver actionable insights which provide positive commercial impact to the wider Customers Business Unit
*Document and communicate findings from analysis to key stakeholders within the organisation
*Contribute to the wider Customer Data Science community by sharing knowledge, experience, ideas and findings
*Continually develop technical capabilities through on-the-job coaching and mentorship from more experienced data scientists in the team
*Drive the Trust agenda forward by ensuring that all decisions affecting customers meet the Trust Test, deliver the desired customer outcomes and provide customers with the ability to make informed choices
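
One of the accountabilities above is engineering model features from raw data. The sketch below shows the general idea on made-up half-hourly meter readings; the feature names are hypothetical and not taken from the role description.

```python
from statistics import mean

# Illustrative feature engineering: derive simple model inputs from
# raw kWh meter readings. Feature names here are hypothetical.

def engineer_features(readings: list) -> dict:
    """Turn a day of raw kWh readings into simple model features."""
    return {
        "daily_total": sum(readings),
        "daily_mean": mean(readings),
        "peak": max(readings),
        # Ratio of peak to mean usage: a crude "spikiness" feature.
        "peak_to_mean": max(readings) / mean(readings),
    }

if __name__ == "__main__":
    raw = [0.2, 0.1, 0.4, 1.6, 0.9, 0.3]
    print(engineer_features(raw))
```

Features like these would then be fed into the supervised or unsupervised models mentioned above.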

Knowledge, Skills, Qualifications & Experience:

Essential:
*Strong understanding of applied maths, statistics, data mining techniques and algorithms
*Good programming knowledge in SQL, Python or other advanced programming languages
*Experience with Big Data frameworks such as Hadoop or Spark
*Highly experienced with cloud computing environments (AWS)
*Familiarity with Jupyter notebooks
*Knowledge of schema, technical tables and application logic for SAP ERP systems (particularly SAP CRM and SAP ISU)
*Skilled in extracting, transforming, wrangling and cleansing large datasets for the purposes of data analysis
*Proven work experience in designing and building accurate predictive and prescriptive models
*Skilled in applying appropriate machine learning algorithms across a range of problems including both structured and unstructured data
*Skilled in creating compelling data visualisation to communicate complex analytical findings
*Ability to inform and influence, and to drive awareness of the benefits of data analytics
*Customer mindset with focus on delivering high quality analysis on time
*Able to effectively communicate complex, technical information to non-technical stakeholders
*Able to proactively identify opportunities for data analysis to create value
*Analytical mindset with a positive attitude to problem solving
*Strong written and verbal communication skills
*Demonstrated ability and flexibility to work in a fast-paced and demanding environment
*Transparent and open in their approach and willing to continually improve through acting on constructive feedback
*Critical thinker who is willing to use evidence to challenge prevailing systems and processes
*Experience and knowledge of key functions of an energy supplier (e.g. customer services, digital channels, sales & marketing, debt recovery, commercial strategy or finance)

Desirable:
*Experience in cutting edge analytical techniques including deep learning algorithms, neural networks and natural language processing
*Ability to implement novel analytical techniques
*Experience of agile working practices and managing agile projects (Scrum, Kanban)
*Open-minded and flexible approach to problem solving - questions currently accepted approaches

Essential qualifications & Experience:
*An academic background in statistics, applied mathematics, computer science, engineering or physical sciences

  • London
  • £120,000 per annum
  • Permanent
  • 10 Mar 2020
Channel Manager UK
Salary: Up to £200k Package (60/40 split)
Location: UK remote
Using AI and BI, this client’s technology combines data science, modern machine learning techniques and behavioural analysis to analyse business data points and detect threats as they try to infiltrate a company’s network.
Due to growth and recent funding, this client will soon be looking for a highly ambitious and driven Channel Manager UK to join their exciting and growing team.

As their Channel Manager UK, you will be tasked with establishing and developing key strategic partnerships with customers, channel partners and technology partners.

Essentials:

- 5+ years’ Big Data channel sales experience
- Extensive experience working in the channel through partners, SIs, distributors and VARs.
- Experience in the UK
- Proven track record of overachieving sales targets
- Willingness to travel to assigned territories

Benefits:

- Up to £120k base + £80k commission and additional benefits

 If you feel this is the right career move, reach out to me via email:  or call me on
  • Hove
  • £450.00 - £500.00 per day
  • Contract
  • 10 Mar 2020

Position: Data Modelling Lead

Location: Brighton

Contract: £500 a day

6-month contract + (Perm option Available)

A Data Modelling Lead is needed to help develop the Data Architecture operating model, set the strategic direction, design the capability model and lead the implementation plan across my client’s customer division. You will also design and implement strategies to enhance the value of their existing assets and drive best practice across on-premise and newly emerging data lakes.

What you will be doing:

  • Contribute to the delivery of the new operating model to enhance Data Architecture capabilities across cloud and on-premise platforms.
  • Contribute to the delivery of the plan to drive adoption of assets such as the Business Glossary and Data Dictionary, demonstrating value and overcoming cultural barriers to Data Management best practice.
  • Contribute to the target design, policies and standards, working proactively to maintain a stable, viable data architecture and ensure consistency of data across the business.
  • Enforce principles of good data design across new platforms, data lakes and changes to existing source systems.
  • Updating the corporate and subject-oriented data models, including an enterprise data model, to facilitate the delivery of data to the business for any purpose.
  • Updating the corporate Business Glossary, Data Dictionary and Data Catalogue artefacts in line with the Best Practice defined within the Data Management team.
  • Work with the Data Architecture Lead to ensure that all new and existing data models are updated to reflect ongoing changes to business, data and applications architecture.

Key skills required:

  • Stakeholder buy-in at senior level
  • Able to bring stakeholders along on the journey
  • Practical, hands-on modelling experience.
  • Seniority: mid-level, 3+ years’ experience
  • Ideally would have experience with both on-premise and cloud architecture
  • Able to embed themselves in IT projects and update data models.
  • Experience of AWS Glue and Big Data modelling tools, e.g. Mega.
  • Tools: Sybase PowerDesigner, IBM Data Architect and/or Mega.
  • Technical and functional - able to communicate with non-technical audiences.
  • Someone who is able to go out into the business, engage with stakeholders and drive business adoption.
  • Needed to help deliver an IT project, migrating all models to a new piece of software.
  • Looking for someone from a Data Architecture or BA-type background.
  • Someone who is comfortable talking through the business side, looking at the business logical modelling.
  • London
  • £450.00 - £500.00 per day
  • Contract
  • 10 Mar 2020

Data Engineer

My global financial client is looking for a Data Engineer to help deliver an ecosystem of enriched and protected data sets, created from raw, structured and unstructured sources. The client has over 300 sources and a rapidly growing book of work, and is utilising the latest technologies to solve business problems and deliver value and truly unique insights. They are looking for Data Engineers to work on collecting, storing, processing and analysing large sets of data.

The primary focus will be on choosing optimal solutions for these purposes, then implementing, maintaining, and monitoring them. You will also be responsible for integrating them with the architecture used across the company and for helping to build out core services that power Machine Learning and analytics systems.

Key Responsibilities: Data Engineer:

* Ability to process and rationalize structured data, message data and semi/unstructured data and ability to integrate multiple large data sources and databases into one system

* Proficient understanding of distributed computing principles and of the fundamental design principles behind a scalable application

* Strong knowledge of the Big Data ecosystem; experience with the Hortonworks/Cloudera platforms

* Practical experience in using HDFS

* Practical expertise in developing applications and using querying tools on top of Hive, Spark (PySpark)

* Strong Scala skills

* Experience in Python, particularly the Anaconda environment and Python based ML model deployment

* Experience of Continuous Integration/Continuous Deployment (Jenkins/Hudson/Ansible)

* Experience with using Git/GitLab as a version control system.
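
The first responsibility above, integrating structured and semi-structured sources into one system, can be sketched as follows. This is a hedged, stdlib-only illustration: a CSV feed stands in for a structured source and JSON lines for a message feed, and the trade fields are invented for the example, not taken from the client's systems.

```python
import csv
import io
import json

# Illustrative integration of a structured (CSV) feed with a
# semi-structured (JSON message) feed, keyed by trade id.
# Field names and sources here are hypothetical.

def integrate(csv_text: str, json_lines: list) -> dict:
    """Merge trade rows from CSV with enrichment messages from JSON, by trade id."""
    combined = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        combined[row["trade_id"]] = {"notional": float(row["notional"])}
    for message in json_lines:
        record = json.loads(message)
        combined.setdefault(record["trade_id"], {})["status"] = record["status"]
    return combined

if __name__ == "__main__":
    csv_feed = "trade_id,notional\nT1,1000.0\nT2,250.5\n"
    messages = ['{"trade_id": "T1", "status": "settled"}']
    print(integrate(csv_feed, messages))
    # {'T1': {'notional': 1000.0, 'status': 'settled'}, 'T2': {'notional': 250.5}}
```

At scale the same join would run over Hive tables or Spark DataFrames rather than in-memory dictionaries, but the keyed-merge idea is the same.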

Nice to Haves

* Knowledge of at least one Python web framework (preferably Flask, Tornado and/or Twisted)

* Basic understanding of front-end technologies, such as JavaScript, HTML5 and CSS3, would be a plus

* Good understanding of global markets, market microstructure and macroeconomics

* Knowledge of the Elastic Stack (ELK)

* Experience with Google Cloud Platform (Dataproc / Dataflow)

Domain Knowledge:

* Knowledge of and experience using data models and data dictionaries in a Banking and Financial Markets context.

* Knowledge of Trade Finance or Securities Services particularly useful.

* Knowledge of one or more of the following domains (including market data vendors):

* Party/Client

* Trade

* Settlements

* Payments

* Instrument and pricing

* Market and/or Credit Risk

Experience is needed with the languages/tools below:

* Java

* HQL, SQL

* Querying tools on top of Hive, Spark (PySpark)

* Scala

* Python, particularly the Anaconda environment

* Git/GitLab as a version control system
