
Similar jobs

Role Objective:
Big Data Engineer will be responsible for expanding and optimizing our data and database architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts, and data scientists on data initiatives and will ensure that optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.
Roles & Responsibilities:
- Sound knowledge of Spark architecture, distributed computing, and Spark Streaming.
- Proficient in Spark, including RDD and DataFrame core functions, troubleshooting, and performance tuning (see the sketch after this list).
- Good understanding of object-oriented concepts and hands-on experience with Scala, with excellent programming logic and technique.
- Strong grasp of functional programming and OOP concepts in Scala.
- Good experience in SQL; able to write complex queries.
- Manage a team of Associates and Senior Associates, ensuring utilization is maintained across the project.
- Mentor new members during onboarding to the project.
- Understand client requirements and be able to design, develop from scratch, and deliver.
- AWS cloud experience would be preferable.
- Design, build and operationalize large scale enterprise data solutions and applications using one or more of AWS data and analytics services - DynamoDB, RedShift, Kinesis, Lambda, S3, etc. (preferred)
- Hands on experience utilizing AWS Management Tools (CloudWatch, CloudTrail) to proactively monitor large and complex deployments (preferred)
- Experience in analyzing, re-architecting, and re-platforming on-premises data warehouses to data platforms on AWS (preferred)
- Lead client calls to flag delays, blockers, and escalations, and to collate requirements.
- Manage project timing and client expectations, and meet deadlines.
- Should have played project and team management roles.
- Facilitate regular meetings within the team.
- Understand business requirements, analyze different approaches, and plan deliverables and milestones for the project.
- Optimize, maintain, and support pipelines.
- Strong analytical and logical skills.
- Ability to comfortably tackle new challenges and learn.
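
To ground the Spark expectations above, here is a minimal, illustrative sketch of DataFrame work with a common tuning step. It is written in PySpark for brevity, although the role itself calls for Scala, where the DataFrame API is analogous; the paths, schema, and column names are hypothetical.

```python
# Minimal PySpark sketch: DataFrame core functions plus a common tuning step.
# All paths and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("orders-rollup")
    .config("spark.sql.shuffle.partitions", "200")  # tune for data volume
    .getOrCreate()
)

# Read raw data, then filter, aggregate, and join -- the "core functions".
orders = spark.read.parquet("s3://example-bucket/raw/orders/")
customers = spark.read.parquet("s3://example-bucket/raw/customers/")

daily = (
    orders
    .filter(F.col("status") == "COMPLETE")
    .groupBy("customer_id", F.to_date("created_at").alias("order_date"))
    .agg(F.sum("amount").alias("daily_total"))
)

# Broadcast the small dimension table to avoid a full shuffle join.
result = daily.join(F.broadcast(customers), "customer_id")

result.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/daily_totals/"
)
```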
External Skills And Expertise
Must have Skills:
- Scala
- Spark
- SQL (Intermediate to advanced level)
- Spark Streaming
- AWS preferred (any cloud acceptable)
- Kafka/Kinesis/any streaming service
- Object-Oriented Programming
- Hive, ETL/ELT design experience
- CI/CD experience (ETL pipeline deployment)
Good to Have Skills:
- AWS Certification
- Git/similar version control tool
- Knowledge of CI/CD, microservices

● Hands-on coding in Java/GoLang, primarily for testing.
● Hands-on experience with one or more of TestNG, JUnit, Spring Test, GoLang testing, etc., OR be a motivated developer who wants to move into testing.
● Ability to find bottlenecks and thresholds in existing code with the help of automation tools.
● Understanding of Object-Oriented Design.
● Crisp understanding of various testing methodologies and categories.
● Ability to come up with, document, and code test scenarios and test cases (see the sketch after this list).
● Experience working with the ‘Agile + DevOps’ process management methodology.
● Experience using one or more of RestAssured, SuperTest, Postman, Swagger.
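
As an illustration of documenting and coding API test cases, here is a minimal sketch using Python's requests library with pytest. The role's own stack centers on RestAssured, SuperTest, or similar; the base URL, endpoints, and payloads below are hypothetical.

```python
# Minimal API-test sketch (requests + pytest). Illustrative only: the
# service URL, endpoints, and response shapes are hypothetical.
import requests

BASE_URL = "https://api.example.com"  # hypothetical service under test


def test_create_user_returns_201_and_echoes_name():
    payload = {"name": "Asha", "email": "asha@example.com"}
    resp = requests.post(f"{BASE_URL}/users", json=payload, timeout=5)
    assert resp.status_code == 201
    body = resp.json()
    assert body["name"] == payload["name"]
    assert "id" in body


def test_get_unknown_user_returns_404():
    resp = requests.get(f"{BASE_URL}/users/does-not-exist", timeout=5)
    assert resp.status_code == 404
```

The same scenario-per-function structure carries over directly to RestAssured or SuperTest.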
Good To Have:
● Knowledge of other programming languages, like JavaScript, Python, etc.
● Experience in using mocking frameworks
● Experience in using API testing frameworks
● Experience in performance testing frameworks and the ability to design performance tests
● Experience in some scripting languages, like Shell, Python, etc.
● Good communication skills in English, both written and verbal
● Valid US Business visa
Roles & Responsibilities:
● Perform test automation, including the creation and management of test scenarios, documentation, and coding of tests.
● Set up environments for testing applications across channels like Web, Mobile, and Desktop, as well as backend applications involving large-scale data migration.
● Create and manage automation reports and regularly communicate them to the team.
● Work with deployment teams and resolve issues at any level of the system.


DevDarshan is a devotional platform launched by IIT graduates to promote the teachings of Indian culture and the Hindu way of life, in India and around the world. In the 21st century, where everything around us is digitized, why not temples too? That's the idea behind DevDarshan. Through our mobile application that connects temples and devotees, we've built a community of devotees from multiple countries, successfully raised seed investment, and started to generate revenue for the temples and priests associated with us. Right now we are looking to grow our team and build new, exciting features for devotees all around the world.
This is where you come in.
We are looking for a passionate and self-motivated individual to help build our Web frontend, too.
Requirements:
- Strong design and user-experience sense. Has worked on building high-quality Web experiences and is extremely focused on coding for the best user experience.
- Experience in a frontend framework like ReactJS or Vue.js.
- Good understanding and experience of NoSQL and SQL databases, and when to use each.
- Experience with CI/CD systems like Jenkins or GitHub Actions.
- Some experience with realtime databases/systems or socket-based applications is preferred.
- Some experience building algorithms or social apps is preferred.
- Any experience with handling video delivery (ffmpeg/HLS/WebRTC) is preferred but not mandatory.
The Role
- You will be involved at all stages of the product development process, from design to development and deployment.
- You have a passion for improving techniques, processes, and tracking, and will work daily to continuously improve our engineering practices.


Django Python framework
PostgreSQL database
- Coordinating with development teams to determine the needs of the application.
- Using the Python programming language to create scalable code.
- Testing the application and fixing bugs.
- Creating the back-end elements (a minimal sketch follows this list).
- Integrating user-facing components using server-side logic.
- Integrating data storage solutions.
- Designing and implementing high-performance, low-latency applications.
- Working in concert with front-end developers.
- Upgrading the functionality of current databases.
- Creating digital technologies to track online activity.
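
A minimal sketch of what those back-end elements might look like in the Django/PostgreSQL stack named above; the app, model fields, and view are hypothetical illustrations, not a prescribed design.

```python
# Minimal Django sketch: a model plus a view that exposes it as JSON.
# The model, fields, and endpoint are hypothetical.
from django.db import models
from django.http import JsonResponse


class PageVisit(models.Model):
    """One row per tracked page view (the 'online activity' element)."""
    path = models.CharField(max_length=255)
    visited_at = models.DateTimeField(auto_now_add=True)


def visit_counts(request):
    """Server-side logic feeding a user-facing component."""
    counts = (
        PageVisit.objects
        .values("path")
        .annotate(total=models.Count("id"))
        .order_by("-total")[:10]
    )
    return JsonResponse({"top_paths": list(counts)})

# Wire the view in urls.py, e.g. path("visits/", visit_counts)
```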
Job description
Position: Data Engineer
Experience: 6+ years
Work Mode: Work from Office
Location: Bangalore
Please note: This position is focused on development rather than migration. Experience in Nifi or Tibco is mandatory.
Mandatory Skills: ETL, DevOps platform, Nifi or Tibco

We are seeking an experienced Data Engineer to join our team. As a Data Engineer, you will play a crucial role in developing and maintaining our data infrastructure and ensuring the smooth operation of our data platforms. The ideal candidate should have a strong background in advanced data engineering, scripting languages, cloud and big data technologies, ETL tools, and database structures.

Responsibilities:
- Utilize advanced data engineering techniques, including ETL (Extract, Transform, Load), SQL, and other advanced data manipulation techniques.
- Develop and maintain data-oriented scripting using languages such as Python.
- Create and manage data structures to ensure efficient and accurate data storage and retrieval.
- Work with cloud and big data technologies, specifically the AWS and Azure stacks, to process and analyze large volumes of data.
- Utilize ETL tools such as Nifi and Tibco to extract, transform, and load data into various systems.
- Have hands-on experience with database structures, particularly MSSQL and Vertica, to optimize data storage and retrieval.
- Manage and maintain the operations of data platforms, ensuring data availability, reliability, and security.
- Collaborate with cross-functional teams to understand data requirements and design appropriate data solutions.
- Stay up-to-date with the latest industry trends and advancements in data engineering and suggest improvements to enhance our data infrastructure.

Requirements:
- A minimum of 6 years of relevant experience as a Data Engineer.
- Proficiency in ETL, SQL, and other advanced data engineering techniques.
- Strong programming skills in scripting languages such as Python (a minimal sketch follows this list).
- Experience in creating and maintaining data structures for efficient data storage and retrieval.
- Familiarity with cloud and big data technologies, specifically the AWS and Azure stacks.
- Hands-on experience with ETL tools, particularly Nifi and Tibco.
- In-depth knowledge of database structures, including MSSQL and Vertica.
- Proven experience in managing and operating data platforms.
- Strong problem-solving and analytical skills with the ability to handle complex data challenges.
- Excellent communication and collaboration skills to work effectively in a team environment.
- Self-motivated with a strong drive for learning and keeping up-to-date with the latest industry trends.
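
Since Nifi and Tibco cover the visual pipeline side, the Python scripting work described above might look like this minimal extract-transform-load sketch; the file path, connection string, and table name are hypothetical.

```python
# Minimal Python ETL sketch: extract a CSV, apply a transform, load to MSSQL.
# File path, connection string, and table name are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

# Extract
df = pd.read_csv("/data/incoming/orders.csv", parse_dates=["created_at"])

# Transform: normalize columns and drop malformed rows
df.columns = [c.strip().lower() for c in df.columns]
df = df.dropna(subset=["order_id", "amount"])
df["amount"] = df["amount"].astype(float)

# Load into MSSQL via pyodbc (driver name varies by environment)
engine = create_engine(
    "mssql+pyodbc://etl_user:secret@db-host/warehouse"
    "?driver=ODBC+Driver+17+for+SQL+Server"
)
df.to_sql("stg_orders", engine, if_exists="append", index=False)
```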

We are hiring ReactJS Developers.
Experience - 3-5 years
Location - Ahmedabad
5 Days working
Required Skills:
- Should be able to do code review and deployments.
- Should have experience in JavaScript, including DOM manipulation and the JavaScript object model.
- Thorough understanding of React.js and its core principles, along with React Hooks, CSS3, and HTML5.
- Experience with popular React.js workflows (such as Redux-Saga)
- Familiarity with newer ECMAScript specifications (ES6+)
- Experience with data-structure libraries (e.g., Immutable.js)
- Familiarity with RESTful APIs
- Experience with common front-end development tools such as Babel, Webpack, NPM, etc.
- A knack for benchmarking and optimization
- Familiarity with code versioning tools such as Git
- Familiarity with tools like React-Redux, immer, Qwest, connected-react-router, TSLint
- UI frameworks like Material UI and Kendo UI
- Familiarity with unit-testing libraries like Jest and jest-dom
- Familiarity with browser-based debugging tools like React Developer Tools
As an AWS Data Engineer, you are a full-stack data engineer who loves solving business problems. You work with business leads, analysts, and data scientists to understand the business domain, and you engage with fellow engineers to build data products that empower better decision-making. You are passionate about the data quality of our business metrics and about the flexibility of your solution as it scales to respond to broader business questions.
If you love to solve problems using your skills, then come join the Team Mactores. We have a casual and fun office environment that actively steers clear of rigid "corporate" culture, focuses on productivity and creativity, and allows you to be part of a world-class team while still being yourself.
What will you do?
- Write efficient code in PySpark and Amazon Glue (a minimal job skeleton follows this list)
- Write SQL queries in Amazon Athena and Amazon Redshift
- Explore new technologies and learn new techniques to solve business problems creatively
- Collaborate with engineering and business teams to build better data products and services
- Deliver projects collaboratively with the team and keep customers updated on time
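
A minimal sketch of the shape of an Amazon Glue PySpark job like those described above; the Glue database, table, and S3 bucket names are hypothetical.

```python
# Minimal AWS Glue job skeleton (PySpark). Database, table, and bucket names
# are hypothetical; this only sketches the shape of a Glue ETL script.
import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog, then transform with the DataFrame API
df = glue_context.create_dynamic_frame.from_catalog(
    database="sales_db", table_name="raw_orders"
).toDF()
clean = df.where(df["amount"] > 0).dropDuplicates(["order_id"])

# Write partitioned Parquet to S3, queryable from Amazon Athena
clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-curated/orders/"
)
job.commit()
```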
What are we looking for?
- 1 to 3 years of experience in Apache Spark, PySpark, and Amazon Glue
- 2+ years of experience in writing ETL jobs using PySpark and Spark SQL
- 2+ years of experience with SQL queries and stored procedures
- A deep understanding of the DataFrame API and the transformation functions supported by recent Spark releases
You will be preferred if you have:
- Prior experience working on AWS EMR and Apache Airflow
- Certifications: AWS Certified Big Data – Specialty, Cloudera Certified Big Data Engineer, or Hortonworks Certified Big Data Engineer
- Understanding of DataOps Engineering
Life at Mactores
We care about creating a culture that makes a real difference in the lives of every Mactorian. Our 10 Core Leadership Principles that honor Decision-making, Leadership, Collaboration, and Curiosity drive how we work.
1. Be one step ahead
2. Deliver the best
3. Be bold
4. Pay attention to the detail
5. Enjoy the challenge
6. Be curious and take action
7. Take leadership
8. Own it
9. Deliver value
10. Be collaborative
We would like you to read more details about the work culture on https://mactores.com/careers
The Path to Joining the Mactores Team
At Mactores, our recruitment process is structured around three distinct stages:
Pre-Employment Assessment: You will be invited to participate in a series of pre-employment evaluations to assess your technical proficiency and suitability for the role.
Managerial Interview: The hiring manager will engage with you in multiple discussions, lasting anywhere from 30 minutes to an hour, to assess your technical skills, hands-on experience, leadership potential, and communication abilities.
HR Discussion: During this 30-minute session, you'll have the opportunity to discuss the offer and next steps with a member of the HR team.
At Mactores, we are committed to providing equal opportunities in all of our employment practices, and we do not discriminate based on race, religion, gender, national origin, age, disability, marital status, military status, genetic information, or any other category protected by federal, state, and local laws. This policy extends to all aspects of the employment relationship, including recruitment, compensation, promotions, transfers, disciplinary action, layoff, training, and social and recreational programs. All employment decisions will be made in compliance with these principles.

- Core Java, Spring Boot, Microservices
- DB2 or any RDBMS database application development
- Linux OS, shell scripting, batch processing
- Troubleshooting large-scale applications
- Experience in automation and unit test frameworks is a must
- AWS Cloud experience desirable
- Agile development experience
- Complete development cycle (Dev, QA, UAT, Staging)
- Good oral and written communication skills

Required Skills:
- Proven work experience as an Enterprise / Data / Analytics Architect - Data Platform in HANA XSA, XS, Data Intelligence, and SDI
- Can work on new and existing architecture decisions in HANA XSA, XS, Data Intelligence, and SDI
- Well versed in data architecture principles, software / web application design, API design, UI / UX capabilities, and XSA / Cloud Foundry architecture
- In-depth understanding of database structure (HANA in-memory) principles
- In-depth understanding of ETL solutions and data integration strategy
- Excellent knowledge of software and application design, APIs, XSA, and microservices concepts
Roles & Responsibilities:
- Advise on and ensure compliance with the defined Data Architecture principles.
- Identify new technology updates and development tools, including new releases/upgrades/patches, as required.
- Analyze technical risks and advise on risk mitigation strategy.
- Advise on and ensure compliance with existing and in-development data and reporting standards, including naming conventions.
The working window is ideally AEST (8 am to 5 pm), which means starting at 3:30 am IST. We understand this can be very early for an SME supporting from India; hence, we can consider candidates who can support from at least 7 am IST (earlier is possible).


