
Data Engineer
at a consulting and implementation services firm serving the Oil & Gas, Mining and Manufacturing industries


- Data Engineer
Required skill set: AWS Glue, AWS Lambda, AWS SNS/SQS, AWS Athena, Spark, Snowflake, Python
Mandatory Requirements
- Experience in AWS Glue
- Experience in Apache Parquet
- Proficient in AWS S3 and data lake
- Knowledge of Snowflake
- Understanding of file-based ingestion best practices.
- Scripting languages: Python and PySpark
CORE RESPONSIBILITIES
- Create and manage cloud resources in AWS
- Ingest data from sources that expose it through different technologies (RDBMS, REST HTTP APIs, flat files, streams, and time-series data from various proprietary systems), and implement ingestion and processing with big data technologies
- Process and transform data using technologies such as Spark and cloud services; understand your part of the business logic and implement it in the language supported by the underlying data platform
- Develop automated data quality checks to ensure the right data enters the platform and to verify the results of calculations
- Develop an infrastructure to collect, transform, combine and publish/distribute customer data.
- Define process improvement opportunities to optimize data collection, insights and displays.
- Ensure data and results are accessible, scalable, efficient, accurate, complete and flexible
- Identify and interpret trends and patterns from complex data sets
- Construct a framework utilizing data visualization tools and techniques to present consolidated analytical and actionable results to relevant stakeholders.
- Key participant in regular Scrum ceremonies with the agile teams
- Proficient at developing queries, writing reports and presenting findings
- Mentor junior members and bring best industry practices
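As a rough illustration of the automated quality checks mentioned above, a gate over ingested records can be sketched in plain Python; the column names and rules here are hypothetical, not from the posting:

```python
# Hypothetical required columns for an ingested record batch;
# names are illustrative only.
REQUIRED_COLUMNS = {"well_id", "timestamp", "pressure_psi"}

def check_batch(rows):
    """Return a list of human-readable issues found in a batch of dict records."""
    issues = []
    for i, row in enumerate(rows):
        missing = REQUIRED_COLUMNS - row.keys()
        if missing:
            issues.append(f"row {i}: missing columns {sorted(missing)}")
            continue
        # Reject null or physically impossible sensor readings.
        if row["pressure_psi"] is None or row["pressure_psi"] < 0:
            issues.append(f"row {i}: invalid pressure_psi {row['pressure_psi']}")
    return issues
```

In practice a gate like this would run inside a Glue or Spark job before data is published to the lake, with failures routed to SNS/SQS for alerting.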
QUALIFICATIONS
- 5-7+ years' experience as a data engineer in consumer finance or an equivalent industry (consumer loans, collections, servicing, optional products, and insurance sales)
- Strong background in math, statistics, computer science, data science or related discipline
- Advanced knowledge of at least one language: Java, Scala, Python, or C#
- Production experience with: HDFS, YARN, Hive, Spark, Kafka, Oozie / Airflow, Amazon Web Services (AWS), Docker / Kubernetes, Snowflake
- Proficient with:
  - Data mining/programming tools (e.g. SAS, SQL, R, Python)
  - Database technologies (e.g. PostgreSQL, Redshift, Snowflake, and Greenplum)
  - Data visualization tools (e.g. Tableau, Looker, MicroStrategy)
- Comfortable learning about and deploying new technologies and tools.
- Organizational skills and the ability to handle multiple projects and priorities simultaneously and meet established deadlines.
- Good written and oral communication skills and ability to present results to non-technical audiences
- Knowledge of business intelligence and analytical tools, technologies and techniques.
Familiarity and experience with the following are a plus:
- AWS certification
- Spark Streaming
- Kafka Streaming / Kafka Connect
- ELK Stack
- Cassandra / MongoDB
- CI/CD: Jenkins, GitLab, Jira, Confluence, and other related tools
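Since the role calls for developing queries and writing reports, here is a minimal, self-contained sketch of an analytical query using Python's stdlib sqlite3 module; the table and data are invented for illustration:

```python
import sqlite3

# In-memory database standing in for a real warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE loans (segment TEXT, balance REAL)")
conn.executemany(
    "INSERT INTO loans VALUES (?, ?)",
    [("auto", 1200.0), ("auto", 800.0), ("personal", 500.0)],
)

# Aggregate outstanding balance per loan segment.
rows = conn.execute(
    "SELECT segment, SUM(balance) FROM loans GROUP BY segment ORDER BY segment"
).fetchall()
print(rows)  # [('auto', 2000.0), ('personal', 500.0)]
```

The same GROUP BY pattern carries over directly to Redshift, Snowflake, or PostgreSQL, just with a different driver.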


Job Requirements:
Intermediate Linux Knowledge
- Experience with shell scripting
- Familiarity with Linux commands such as grep, awk, sed
- Required
Advanced Python Scripting Knowledge
- Strong expertise in Python
- Required
Ruby
- Nice to have
Basic Knowledge of Network Protocols
- Understanding of TCP/UDP, Multicast/Unicast
- Required
Packet Captures
- Experience with tools like Wireshark, tcpdump, tshark
- Nice to have
High-Performance Messaging Libraries
- Familiarity with tools like Tibco, 29West, LBM, Aeron
- Nice to have
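As a small illustration of the protocol fundamentals listed above, the 8-byte UDP header defined in RFC 768 can be unpacked with Python's stdlib struct module; the sample datagram below is fabricated:

```python
import struct

def parse_udp_header(datagram: bytes) -> dict:
    """Parse the 8-byte UDP header: source port, destination port, length, checksum.

    All four fields are unsigned 16-bit integers in network (big-endian) byte order.
    """
    src, dst, length, checksum = struct.unpack("!HHHH", datagram[:8])
    return {"src_port": src, "dst_port": dst, "length": length, "checksum": checksum}

# Hand-built datagram: src port 5000, dst port 53, length 12, checksum 0,
# followed by a 4-byte payload.
sample = struct.pack("!HHHH", 5000, 53, 12, 0) + b"ping"
hdr = parse_udp_header(sample)
```

This is the same layout tools like Wireshark and tcpdump decode for you; parsing it by hand is a quick way to verify byte-order handling.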
Job Summary:
We are looking for a highly skilled and experienced Java Backend Developer with 4.5+ years of hands-on experience in backend development, system design, and API integration. You’ll play a key role in designing and building high-performance backend systems that are scalable, secure, and reliable. As a core member of the engineering team, you’ll collaborate closely with product owners, architects, and other developers to deliver quality solutions.
Responsibilities:
- Design and develop backend components, REST APIs, and microservices using Java and Spring Boot.
- Contribute to the architectural decisions and system design discussions.
- Write clean, efficient, and testable code following best practices.
- Optimize application performance, scalability, and reliability.
- Integrate third-party APIs and work with external systems.
- Participate in code reviews, mentor junior developers, and support the team’s technical growth.
- Work closely with DevOps to support CI/CD pipelines and deployments.
- Troubleshoot and resolve complex issues in production and non-production environments.
- Keep up with industry trends and advocate for best practices in backend development.
Required Skills:
- 4.5+ years of hands-on experience in backend development with Java (Java 8 or above).
- Strong expertise in Spring Framework (Spring Boot, Spring MVC, Spring Data).
- Experience designing and consuming RESTful APIs and working with microservices architecture.
- Solid understanding of relational and NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB).
- Proficient with version control systems like Git and tools like Maven or Gradle.
- Familiarity with Docker, containerization, and cloud platforms (AWS/GCP/Azure).
- Experience with unit testing and integration testing (JUnit, Mockito, etc.).
- Good knowledge of software design principles, data structures, and algorithms.
- Excellent problem-solving skills and attention to detail.
DevOps Engineer - remote, Full time
Coincrowd is an innovative Fintech company. We offer a crypto platform for seamless payments, Crypto Vouchers, crypto trading, portfolio management, real time market data, breaking news and powerful analytics. Please visit https://coincrowd.com/ for more information.
Domain : Finance, Blockchain, Crypto
Job Description:
We're seeking a detail-oriented and proactive DevOps Engineer who has a strong background in Google Cloud Platform (GCP) environments. The ideal candidate will be comfortable operating in a fast-paced, dynamic startup environment, where they will have the opportunity to make substantial contributions.
Key Responsibilities:
- Develop, test, and maintain infrastructure on GCP.
- Automate infrastructure, application deployment, scaling, and management using Kubernetes and other similar tools.
- Collaborate with our software development team to ensure seamless deployment of software updates and enhancements.
- Monitor system performance and troubleshoot issues.
- Ensure high levels of performance, availability, sustainability, and security.
- Implement DevOps best practices, such as Infrastructure as Code (IaC).
Qualifications:
- Proven experience as a DevOps Engineer or similar role in software development and system administration.
- Strong experience with GCP (Google Cloud Platform), including Compute Engine, Cloud Functions, Cloud Storage, and other relevant GCP services.
- Knowledge of Kubernetes, Docker, Jenkins, or similar technologies.
- Familiarity with network protocols, firewalls, and VPN.
- Experience with scripting languages such as Python, Bash, etc.
- Understanding of Infrastructure as Code (IaC) tools, like Terraform or CloudFormation.
- Excellent problem-solving skills, attention to detail, and ability to work in a team.
What We Offer:
In recognition of your valuable contributions, you will receive an equity-based compensation package. Join our dynamic and innovative team in the rapidly evolving fintech industry and play a key role in shaping the future of Coincrowd's success.
If you're ready to be at the forefront of the Payment Technology revolution and have the vision and experience to drive sales growth in the crypto space, please join us in our mission to redefine fintech at Coincrowd.

Skills & Experience:
- 3 to 10 Years of experience in Ruby on Rails
- Knowledge of Active Admin
- Experience using RSpec
- Experience with various third-party integrations
- Knowledge of AWS/Azure would be a plus
- Hands-on AngularJS/ReactJS/NodeJS experience with AWS exposure is a plus
EnKash is a leading corporate spends management and Payments Company. EnKash’s platform helps
businesses manage their payables, receivables, corporate cards & expenses. Businesses can manage
& control their spends in a completely DIY environment. EnKash serves both, financial institutions
and businesses with an objective to bring operational efficiencies and savings in business payments
flow.
EnKash is a leader in value creation for businesses, especially SMBs. Over the past four years EnKash
has delivered savings to more than 65,000 users by digitizing their processes. The best part is that
EnKash works as a layer on businesses' existing software and banking relationships, building
state-of-the-art experience and accessibility. This, combined with ease of access to credit,
purpose-based cards, end-use monitoring, customisable approval workflows and integrations with leading
banks and accounting platforms, is set to ease the journey for our SMBs.
The management and founding team at EnKash come with 100+ years of experience in the payments and
banking domain and have been instrumental in bringing many firsts to the ecosystem. EnKash is
based out of India with offices in Mumbai, NCR, Pune and Bengaluru.
While a leader in the Indian ecosystem, EnKash is often compared with global peers like Ramp,
Spendesk, Pleo, Payhawk, Soldo, Lithic, Marqeta and bill.com, combining elements of their compelling
business models.
EnKash is a series B funded company with some marquee investors on board.
Feathers in the cap:
· Winner: India Fintech Awards 2020 by NASSCOM
· Best B2B Solution provider of the year, 2020 at Payments & Cards Summit.
Responsibilities:
- Manage each project's scope and timeline
- Coordinate sprints, retrospective meetings and daily stand-ups across teams
- Coach team members in Agile frameworks
- Facilitate internal communication and effective collaboration
- Be the point of contact for external communications (e.g. from customers or stakeholders)
- Work with product owners to handle backlogs and new requests
- Resolve conflicts and remove obstacles that occur
- Help teams implement changes effectively
- Ensure deliverables are up to quality standards at the end of each sprint
- Guide development teams to higher scrum maturity
- Help build a productive environment where team members 'own' the product and enjoy working on it
- High level understanding of various tech stacks, basics of API and microservices
Requirements:
- Leadership, Decision-Making skills.
- Responsibility.
- Effective communication.
- Attention to Detail.
- Task Delegation.
- Comfortable with participatory management.


2. Linux OS, shell scripting, Batch Processing ( 7+ years)
3. Troubleshooting Large Scale Applications ( 7+ years)
4. Java, SpringBoot, MicroServices ( 5+ years desirable – not mandatory)
5. Agile Development Experience ( 5+ years)
6. Complete Development Cycle ( Dev, QA, UAT, Staging) ( 5+ years)
7. OpenShift PaaS Environment ( 5+ years desirable, not mandatory)
8. Good Oral and Written Communication Skills ( 5+ years )
9. AWS Cloud experience desirable (5+ years)
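The batch-processing and troubleshooting items above might look like the following in practice: a Python sketch that triages "ERROR" lines across a directory of log files (the filenames and log contents are invented):

```python
import pathlib
import tempfile

def count_errors(log_dir: str) -> dict:
    """Count lines containing 'ERROR' in every *.log file under log_dir."""
    counts = {}
    for path in sorted(pathlib.Path(log_dir).glob("*.log")):
        counts[path.name] = sum(
            1 for line in path.read_text().splitlines() if "ERROR" in line
        )
    return counts

# Demo on a throwaway directory with two fabricated log files.
with tempfile.TemporaryDirectory() as d:
    (pathlib.Path(d) / "a.log").write_text("INFO ok\nERROR boom\n")
    (pathlib.Path(d) / "b.log").write_text("ERROR x\nERROR y\n")
    result = count_errors(d)
```

A shell one-liner (`grep -c ERROR *.log`) does the same job interactively; the Python version is easier to extend into a scheduled batch job.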
- Azure, Azure AD, ADFS, Azure AD Connect, Microsoft Identity Management
- Azure, Architecture, solution designing, Subscription Design
Working closely with the Product group and other teams, the Data Engineer is responsible for the development, deployment and maintenance of our data infrastructure and applications. With a focus on quality, error-free data delivery, the Data Engineer works to ensure our data is appropriately available and fully supporting various constituencies across our organization. This is a multifaceted opportunity to work with a small, talented team on impactful projects that are essential to ACUE’s higher education success. As an early member of our tech group, you’ll have the unique opportunity to build critical systems and features while helping shape the direction of the team and the product.


