REDEX empowers corporations and individuals around the globe to go green and do their part in protecting the environment from further climate change.
Clients can support and contribute to sustainable and renewable energy sources by purchasing renewable energy certificates (RECs) on our reliable trading platform, powered by cutting-edge blockchain technology and artificial intelligence.
REDEX provides a TRUSTED end-to-end service for our clients: RECs qualified by the global registry, fast buyer-seller matching, post-sale ownership verification, easy REC retirement, and trading fraud prevention.
Responsibilities:
- Implement new energy projects involving data ingestion and analysis of energy usage and REC transactions.
- Build data pipelines to bring together information from different data sources.
- Lead the engineering and data modeling work and guide other engineers on performing the work.
- Perform data cleansing and structure content for use in analytics applications.
- Create data models that will allow intuitive data visualization and analysis.
- Design algorithms that use RECs in innovative ways to realize the virtual battery concept.
- Forecast data patterns using artificial intelligence and machine learning techniques.
- Detect fraudulent transactions using supervised and unsupervised machine learning techniques.
Main Tasks:
- Carry out data engineering work, including ingestion of data points and analysis of data for the energy sector domain.
- Implement data lake using Azure Cloud technologies such as Data Factory and Databricks.
- Design and implement new software products following the agreed design patterns and agile methodology
- Take technical responsibility for specific areas of software solutions
- Create and maintain documentation for the developed software
- Perform code reviews as part of the development process
- Write unit tests for all developed features
- Use large language models and generative AI tools such as ChatGPT and Bard to further optimize business workflows.
Requirements:
- Bachelor’s degree in computer programming, computer science, mathematics, or a related field.
- More than one year of experience in big data ingestion and engineering work.
- Experience applying artificial intelligence or machine learning technologies.
- Fluency in at least one programming language such as Python or C#.
- Proficiency in data engineering work.
- Experience with open-source big data tools and Azure cloud data engineering technologies.
- Ability to work independently or as part of a team.
- Experience in building REST-oriented APIs.
Nice to have:
- Domain experience in working with blockchain technologies.
- TensorFlow, Hadoop, Azure Machine Learning
- ETL tools, Azure Databricks, Azure Data Factory
- Related certifications such as Microsoft Certified: Azure Data Engineer Associate, Azure AI Engineer Associate, or Azure Data Scientist Associate
Benefits:
Hybrid Working Environment – 40% work from home
Performance Bonus