  • City Of London
  • £60,000 - £80,000 per annum
  • Permanent
  • 10 Mar 2020

I am engaged with a leading organisation in the insurance space who are looking to recruit experienced Data Engineers into their growing 'Cloud & Data' function, based at their HQ in London (a five-minute walk from the Gherkin).

The organisation are going through a huge recruitment drive in 2020, focused on improving the way they utilise data, and you'll be pleased to know that, unlike a lot of non-tech companies, they view technology as an enabler rather than a blocker.

The roles in question are focused primarily on the design and development of a strategic analytical platform (AWS) to enable the integration, analysis and insight of a variety of data sets, as well as the development of data and analytics solutions for machine learning and data visualisation. Moreover, you will be expected to develop and manage relationships across data, technology and business functions (think stakeholder management), challenging existing paradigms in a constructive manner that demonstrates and promotes the value of a mature data analytics capability.

Fundamentally, you will be given the opportunity to drive a data-first mentality within the organisation, a unique and rewarding challenge that is rare to find in an established organisation such as this one.

In terms of their expectations… if you can demonstrate some of the skills below, you stand a good chance of being shortlisted for the interview stage:

  • Excellent understanding of Data Architecture, covering both on-premises environments and the AWS Cloud.
  • Strong experience with Data Modelling and Data Analysis, including the ability to query, prepare and analyse Data sets.
  • Proven commercial experience with both SQL and NoSQL databases.
  • Skilled in software engineering, with experience in some (not necessarily all) of the following programming languages: Python, R, Java, Scala.
  • Familiarity with Linux/UNIX and tools including Git and SVN.
  • Good understanding of Data Science models and Data Modelling in general (e.g. Star Schema, Kimball).
  • Some commercial experience with technologies in the Big Data space (e.g. Cassandra, Spark, Kafka, Hadoop, Kibana etc.).
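
To give a flavour of the Big Data work mentioned above, the kind of Spark Structured Streaming job you might build looks something like this (an illustrative sketch only: the Kafka broker, topic and column handling are hypothetical, and it assumes the spark-sql-kafka package is available):

    # Illustrative sketch: count events per type from a hypothetical Kafka topic
    # using Spark Structured Streaming.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("event-counts").getOrCreate()

    events = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
        .option("subscribe", "policy-events")               # hypothetical topic
        .load()
    )

    # Treat the Kafka message value as the event type and keep a running count.
    counts = (
        events.select(F.col("value").cast("string").alias("event_type"))
        .groupBy("event_type")
        .count()
    )

    # Write the running counts to the console; a real job would target a proper
    # sink such as a warehouse table or a dashboard store.
    query = counts.writeStream.outputMode("complete").format("console").start()
    query.awaitTermination()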

In return, you will get the opportunity to work on cutting-edge projects in an Agile environment.

Moreover, I have personally worked with these guys for years and they have an excellent company culture, focusing on tailored personal development, genuine career progression and flexibility, whether it be home working opportunities or simply flexible working hours. They also offer a generous bonus/benefits package as well as the chance to get huge discounts on car, travel and home insurance.

So if you like the sound of the role and would like to find out more then please contact me via the following:

  • Mobile:
  • Office Line:
  • Email:
  • LinkedIn: 'Liam Haghighat'

Anything discussed will remain completely confidential and fully compliant with GDPR.

  • Peterborough
  • £60,000 - £75,000 per annum, inc benefits
  • Permanent
  • 10 Mar 2020

Global Manufacturer require a Data Engineer to work with developers, database architects and data scientists on strategic data initiatives. You will be responsible for expanding and optimizing data and pipeline architecture, as well as data flow and collection for cross-functional teams. You will be an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up.

Client Details

Global Manufacturer / Distributor

Description

Global Manufacturer are looking for a Data Engineer to work with software developers, database architects, data analysts and data scientists on strategic data initiatives. The Data Engineer will be responsible for expanding and optimizing data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up.

As Data Engineer you will be responsible for:

  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex data sets that meet functional and non-functional business requirements.
  • Identify, design and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation and loading of data from a wide variety of data sources using SQL/NoSQL and Azure 'big data' technologies.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Keep our data separated and secure across national boundaries through multiple data centres and Azure regions.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
  • Work with data and analytics experts to strive for greater functionality in our data systems.
  • Develop, construct, maintain and test data architectures, such as databases and large-scale processing systems.
  • Clean, aggregate and organize data from disparate sources and transfer it to data storage for analysis, making sure the company's data technology operates at its peak and delivers improvements to performance, cost, or both.
  • Own the stability of newly designed products, including their ongoing robustness and resilience.
  • Build and maintain custom ingestion pipelines.

Experience with the following software/tools:

  • Big data tools: Hadoop, Spark, Kafka, etc.
  • Relational SQL and NoSQL databases, including Postgres and Cassandra
  • Data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc. (a short Airflow sketch follows this list)
  • AWS cloud services: EC2, EMR, RDS, Redshift
  • Stream-processing systems: Storm, Spark-Streaming
  • Object-oriented/functional scripting languages: Python, Java, C++, Scala
  • Data Integration / Archiving / Relational Databases and Data Warehousing
  • MS SQL, Logic Apps, Function Apps, SSIS, SSRS
  • Microsoft Azure DevOps
  • Data Visualisation
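
By way of illustration, a minimal Airflow DAG for the kind of pipeline orchestration listed above might look like the following (a hedged sketch only: the DAG id, schedule and task bodies are hypothetical, and it is written against the Airflow 2.x API):

    # Illustrative sketch: a daily extract -> load workflow expressed as an Airflow DAG.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract():
        print("pull data from the source systems")


    def load():
        print("write curated data to the warehouse")


    with DAG(
        dag_id="daily_sales_pipeline",   # hypothetical DAG id
        start_date=datetime(2020, 3, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(task_id="load", python_callable=load)

        extract_task >> load_task   # run the load only after the extract succeeds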

Profile

  • Graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field.
  • Advanced working knowledge of SQL, experience with relational databases and query authoring, and working familiarity with a variety of databases.
  • Experience building and optimizing 'big data' data pipelines, architectures and data sets.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with unstructured datasets.
  • Experience building processes that support data transformation, data structures, metadata, dependency and workload management.
  • A successful history of manipulating, processing and extracting value from large disconnected datasets.
  • Working knowledge of message queuing, stream processing, and highly scalable 'big data' data stores.
  • Strong project management and organizational skills.

Job Offer

Opportunity to join a rapidly expanding team

Opportunity to work in an advanced analytics function

  • Hampshire
  • £60,000 - £70,000 per annum
  • Permanent
  • 10 Mar 2020

Senior / Lead C# Developer

C#, Web API, CI/CD

£60,000 - £70,000 + 25% Bonus and Excellent Benefits

Would you like to work in one of the most exciting and innovative development teams in Hampshire?
If you, like them, strive for innovation and continuous improvement, crave collaborative working, thrive on new product development and want to work with the best people in the industry, then this is the place for you.

We have an exciting new opportunity for a Senior / Lead C# Developer to join one of the most dynamic and talented software teams in the South. Boasting state-of-the-art technology, huge investment in R&D and a greenfield development environment, we can honestly say it doesn't get much better than this.

You will spend your time designing and developing solutions used by over 1 million members, powered by TeamCity, Octopus and Azure, and deploying a new version of the site up to 6 times a day.

The company operates the following tech environment, and you will need to be well versed in the majority of it:

  • C#
  • ASP.Net 4+
  • Web API
  • Mobile API
  • CI/CD
  • MS SQL, NoSQL, Mongo/Hadoop
  • Good design patterns and architectural skills - SOLID / OOP
  • Good real work experience of testing
  • Bonus skills - PaaS, IaaS, Azure, DevOps

The Benefits

  • Modern Azure based technology stack
  • Dedicated innovation time
  • Competitive salary
  • Bonus scheme - 25%
  • Flexible working - 2 days per week remote
  • Private medical and life insurance
  • Company contribution pension scheme

The interview process is friendly and engaging, starting with a telephone conversation aimed at giving you great insight into the company, team, products and culture of the business. From this point on you will be totally sold on the opportunity and excited about the prospect of working here.

Please apply now or contact for more info

  • Coventry
  • Salary negotiable
    • Temp
  • 10 Mar 2020

Job Title: Contract - Data Migration Engineer

Day Rate: £Negotiable (Inside IR35)

Team: Technical Operations

Office: Coventry

Duration: 6-12 Months

Shell Energy has partnered exclusively with Robert Walters to find an experienced Contract Data Migration Engineer in Coventry to support a migration project.

About Shell Energy

At Shell Energy, we provide more and cleaner energy solutions across a global portfolio of gas, power and environmental products to meet the current and future energy needs of our customers: energy producers, asset owners, traders, wholesalers and large commercial and industrial energy users.

Your part in our journey:

You will be working closely with our internal customers to ensure well architected cloud services are delivered that meet fast changing demands.

This is working on a greenfield migration project.

Skills

  • Expert-level experience of SQL Server
  • Cloud experience, preferably AWS, but open to people with experience of Azure and GCP.
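
To give a feel for the hands-on side of the migration work, a first staging step for moving a SQL Server table towards AWS could look something like this (an illustrative sketch only: the connection string, table and S3 bucket are hypothetical, and it assumes pandas, SQLAlchemy, pyodbc, pyarrow and s3fs are installed):

    # Illustrative sketch: stage one SQL Server table into S3 as Parquet files.
    import pandas as pd
    import sqlalchemy

    # Hypothetical on-premises SQL Server connection.
    engine = sqlalchemy.create_engine(
        "mssql+pyodbc://user:password@onprem-sql/energy_db"
        "?driver=ODBC+Driver+17+for+SQL+Server"
    )

    # Read the source table in chunks so large tables do not exhaust memory,
    # writing each chunk to a hypothetical S3 staging bucket as Parquet.
    query = "SELECT * FROM dbo.meter_readings"
    for i, chunk in enumerate(pd.read_sql(query, engine, chunksize=100_000)):
        chunk.to_parquet(
            f"s3://migration-staging/meter_readings/part-{i:05d}.parquet",
            index=False,
        )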

Desirable

  • Experience with Big Data, Hadoop
  • Snowflake
  • Databricks
  • StreamSets
  • AWS certifications and references of work on AWS migration projects would be favourable.

To find out more about the opportunity, or to apply, please contact Ash Ali at

All direct and third-party applications will be forwarded to our exclusive recruitment partner Robert Walters for their consideration.

  • Surrey
  • Competitive salary
    • Contract
  • 10 Mar 2020

An exciting opportunity for a Talend Data Engineer to join a cyber security company and work with their suppliers to translate business ideas into innovative data solutions and enterprise architecture. The contract is for 6 months, offering 2 days of remote work per week. This role is outside IR35.

Key Talend Data Engineer specifications

  • 5+ years of Hadoop ecosystem project work
  • 5+ years of Data Governance and Data Modelling
  • 3+ years designing and developing ELT solutions
  • 3+ years of Snowflake

Application process 

If you are interested in finding out more about the (contract) Talend Data Engineer role, please contact the X4 Group on . My client is seeking a start for mid-April.
