
Key Responsibilities:
- Develop proprietary processes and procedures for handling the various data streams around the organization's critical databases
- Manage technical resources across data technologies, including relational databases, NoSQL databases, business intelligence databases, scripting languages, big data tools and technologies, and visualization tools
- Create a project plan, including timelines and the critical milestones needed for the project to succeed
- Identify the vital skill sets and staff required to complete the project
- Identify the crucial sources of data needed to achieve the objective
Skill Requirements:
- Experience with data pipeline processes and tools (a minimal pipeline sketch follows this list)
- Well versed in the data domains (Data Warehousing, Data Governance, MDM, Data Quality, Data Catalog, Analytics, BI, Operational Data Store, Metadata, Unstructured Data, ETL, ESB)
- Experience with an established ETL tool, e.g. Informatica or Ab Initio
- Deep understanding of big data systems such as Hadoop, Spark, YARN, Hive, Ranger, and Ambari
- Deep knowledge of the Qlik ecosystem, including QlikView, Qlik Sense, and NPrinting
- Proficiency in Python or a similar programming language
- Exposure to data science and machine learning
- Comfort working in a fast-paced environment
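To give a flavor of this kind of pipeline work, below is a minimal PySpark sketch of a batch transformation, assuming a Hive-enabled Spark cluster; the input path, column names, and output table are illustrative placeholders, not references to any actual ACT system.

    # Minimal illustrative PySpark batch job: read raw usage records, aggregate them
    # per customer per day, and write the result to a Hive table. All names are placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = (
        SparkSession.builder
        .appName("daily_usage_rollup")
        .enableHiveSupport()  # assumes the cluster exposes a Hive metastore
        .getOrCreate()
    )

    raw = spark.read.parquet("/data/raw/usage/")  # hypothetical input path

    daily = (
        raw.groupBy("customer_id", "usage_date")
           .agg(F.sum("bytes_used").alias("total_bytes"))
    )

    daily.write.mode("overwrite").saveAsTable("analytics.daily_usage")  # hypothetical table
    spark.stop()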
Soft Attributes:
- Independence: Able to work without constant direction or supervision; self-motivated, with a strong work ethic and a willingness to consistently put in extra effort
- Creativity: Able to generate imaginative, innovative solutions that meet the needs of the organization. A strategic thinker and solution seller who can conceive integrated solutions (with field force apps, customer apps, CCT solutions, etc.) and approach each unique situation or challenge in different ways using the same tools
- Resilience: Remains effective in high-pressure situations, using both positive and negative outcomes as motivation to keep moving toward personal and team goals

About ACT FIBERNET
ACT (Atria Convergence Technologies Pvt. Ltd.) is one of the country's most renowned triple-play service providers, with close to 1.5 million happy customers. We are on the threshold of becoming a 1,000 crore company, with a strong team of more than 6,500 employees across markets. Our customers currently experience the following state-of-the-art services under the ACT brand:
- Fibernet (Internet over Fiber Optics)
- HD TV
- Digital TV
Headquartered in Bangalore, ACT is spread across the length and breadth of Karnataka, Andhra Pradesh, and Tamil Nadu. A pioneer in Fiber-To-The-Home technology, ACT Fibernet is currently the largest non-telco and the fastest-growing Internet Service Provider in the country. ACT is funded by IVFA (India Value Fund Advisors), a private equity investment fund responsible for building giants like Biocon, Radio City, and HDFC Bank, to name a few. ACT's leadership team comprises 30+ professionals hailing from industries as diverse as FMCG, Entertainment, Information Technology, Telecom, and Retail.


The Opportunity
We’re looking for a Senior Data Engineer to join our growing Data Platform team. This role is a hybrid of data engineering and business intelligence, ideal for someone who enjoys solving complex data challenges while also building intuitive and actionable reporting solutions.
You’ll play a key role in designing and scaling the infrastructure and pipelines that power analytics, dashboards, machine learning, and decision-making across Sonatype. You’ll also be responsible for delivering clear, compelling, and insightful business intelligence through tools like Looker Studio and advanced SQL queries.
What You’ll Do
- Design, build, and maintain scalable data pipelines and ETL/ELT processes (a minimal orchestration sketch follows this list).
- Architect and optimize data models and storage solutions for analytics and operational use.
- Create and manage business intelligence reports and dashboards using tools like Looker Studio, Power BI, or similar.
- Collaborate with data scientists, analysts, and stakeholders to ensure datasets are reliable, meaningful, and actionable.
- Own and evolve parts of our data platform (e.g., Airflow, dbt, Spark, Redshift, or Snowflake).
- Write complex, high-performance SQL queries to support reporting and analytics needs.
- Implement observability, alerting, and data quality monitoring for critical pipelines.
- Drive best practices in data engineering and business intelligence, including documentation, testing, and CI/CD.
- Contribute to the evolution of our next-generation data lakehouse and BI architecture.
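As a rough illustration of the pipeline and data-quality work described above, here is a minimal Airflow DAG sketch in Python. The DAG id, task names, and the extract/load/check functions are hypothetical placeholders rather than part of Sonatype's actual platform, and the schedule argument assumes Airflow 2.4 or newer.

    # Minimal illustrative Airflow DAG: extract a daily batch, load it into a staging
    # table, then run a simple row-count quality gate. All names are placeholders.
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_events(**context):
        # Placeholder: pull the day's events from a source system (API, S3, etc.)
        print("extracting events for", context["ds"])

    def load_to_warehouse(**context):
        # Placeholder: write the extracted batch to a warehouse staging table
        print("loading staging table for", context["ds"])

    def check_row_counts(**context):
        # Placeholder data-quality gate: fail the run if the batch looks empty
        row_count = 1  # would normally come from a warehouse query
        if row_count == 0:
            raise ValueError("No rows loaded; failing the DAG run")

    with DAG(
        dag_id="events_daily",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
        default_args={"retries": 1, "retry_delay": timedelta(minutes=5)},
    ) as dag:
        extract = PythonOperator(task_id="extract_events", python_callable=extract_events)
        load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)
        quality = PythonOperator(task_id="check_row_counts", python_callable=check_row_counts)

        extract >> load >> quality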
What We’re Looking For
Minimum Qualifications
- 5+ years of experience as a Data Engineer or in a hybrid data/reporting role.
- Strong programming skills in Python, Java, or Scala.
- Proficiency with data tools such as Databricks, data modeling techniques (e.g., star schema, dimensional modeling), and data warehousing solutions like Snowflake or Redshift.
- Hands-on experience with modern data platforms and orchestration tools (e.g., Spark, Kafka, Airflow).
- Proficient in SQL with experience in writing and optimizing complex queries for BI and analytics.
- Experience with BI tools such as Looker Studio, Power BI, or Tableau.
- Experience in building and maintaining robust ETL/ELT pipelines in production.
- Understanding of data quality, observability, and governance best practices (a small data-quality check example follows this list).
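To make the data-quality and observability expectation concrete, here is a small, hedged Python sketch that runs basic checks against a warehouse table via SQLAlchemy; the connection string and the fact_orders table are assumptions made for the example, not references to a real Sonatype dataset.

    # Minimal illustrative data-quality check: verify a warehouse table is non-empty
    # and that a key column has no NULLs. Connection string and table are placeholders.
    from sqlalchemy import create_engine, text

    engine = create_engine("postgresql://user:password@warehouse:5432/analytics")  # placeholder

    checks = {
        "row_count": "SELECT COUNT(*) FROM fact_orders",
        "null_order_ids": "SELECT COUNT(*) FROM fact_orders WHERE order_id IS NULL",
    }

    with engine.connect() as conn:
        row_count = conn.execute(text(checks["row_count"])).scalar()
        null_ids = conn.execute(text(checks["null_order_ids"])).scalar()

    if row_count == 0:
        raise RuntimeError("fact_orders is empty; refusing to publish downstream reports")
    if null_ids > 0:
        raise RuntimeError(f"fact_orders has {null_ids} rows with NULL order_id")

    print(f"fact_orders passed checks: {row_count} rows, no NULL order ids")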
Bonus Points
- Experience with dbt, Terraform, or Kubernetes.
- Familiarity with real-time data processing or streaming architectures.
- Understanding of data privacy, compliance, and security best practices in analytics and reporting.
Why You’ll Love Working Here
- Data with purpose: Work on problems that directly impact how the world builds secure software.
- Full-spectrum impact: Use both engineering and analytical skills to shape product, strategy, and operations.
- Modern tooling: Leverage the best of open-source and cloud-native technologies.
- Collaborative culture: Join a passionate team that values learning, autonomy, and real-world impact.
“Software testers have the heart of developers in a jar on the desk”
HTTP 428 - TL;DR
● You can find 13+ issues without using Google at this link: http://testingchallenges.thetestingmap.org/
● You often pick Python, PHP, or JavaScript to do the work for you
● Your friends can’t use your laptop as it’s running Linux
● Developers are afraid of you
● You find unconventional ways in your free time to find loopholes in your favorite applications
● You can find flaws in your own code
If this sounds like you, join GreyB’s Red Team. Be part of the crew using your critical testing skills to help IP and R&D leaders hit their technology-centric business goals.
A tester needs to be one step ahead of developers, so you need to be good with the LAMP stack (plus frameworks like Vue.js and Django) and various JS libraries. Familiarity with databases (like Postgres or MySQL) is a plus.
HTTP 201 - You can join us if
● You have a zeal for investigating; we’re looking for the Sherlock Holmes of the programming world
● You are HIGH on logic
● You can shift gears across multiple projects
● You pay attention to detail and analyze features from various angles
● You are familiar with basic web concepts, such as how the web works and how a page is rendered
● You can work with quick changes and understand the objectives
● You believe in automation more than manual testing. How about automating UX testing? (A minimal Selenium sketch follows this list.)
● You are willing to identify new testing tools and approaches that advance testing and security
● You’re interested in multiple kinds of testing, like UI testing, integration testing, load testing, and combatting abuse of our products
● Coding is your blood type
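In the spirit of automating UX testing, here is a hedged sketch of a browser-level UI check using Python, pytest, and Selenium; the URL and element locators are placeholders invented for the example, and it assumes chromedriver is available on the PATH.

    # Minimal illustrative UI check with Selenium + pytest: load a page, run a search,
    # and assert that results appear. URL and locators are placeholders.
    import pytest
    from selenium import webdriver
    from selenium.webdriver.common.by import By
    from selenium.webdriver.common.keys import Keys

    @pytest.fixture
    def browser():
        driver = webdriver.Chrome()  # assumes chromedriver is on the PATH
        yield driver
        driver.quit()

    def test_search_returns_results(browser):
        browser.get("https://example.com/search")   # placeholder URL
        box = browser.find_element(By.NAME, "q")    # placeholder locator
        box.send_keys("fiber optic patent")
        box.send_keys(Keys.RETURN)
        results = browser.find_elements(By.CSS_SELECTOR, ".result")
        assert results, "expected at least one search result"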
HTTP 100 - Good to have
● Hands-on experience with any testing framework
● Familiar with exploitation approaches
HTTP 418 - What will you be learning here?
● Defining your own style of testing
● Collaborating with a team of executioners: developers, designers, quality assurance experts, research analysts, and more
● Interacting with clients, understanding their problems and pain points
● Executing various levels of testing while understanding the domain
● Looking at the user’s psychology of using products
● Developing your strategic skills by planning and implementing new tests
● Giving life to your ideas in Team Sprints and Hackathons
● Having fun while working (and the list doesn’t end here ;)
Responsibilities:
- Understand platform requirements & coordinate with the development agency for smooth development of our platform
- Make incremental design or developmental changes on the platform on a regular basis
- Ideate & develop new features on our platform
- Conduct regular testing of website performance and fix bugs (either through the agency or yourself, as applicable)
- Work with UI/UX designer to implement design systems & user experiences
- Proven experience of 2-3 years as a full-stack developer or in a similar role
- Experience in developing scalable desktop and mobile applications
- Knowledge of multiple front-end languages and libraries (e.g. HTML/CSS, JavaScript, XML, jQuery)
- Knowledge of multiple back-end languages (e.g. C#, Java, Python) and JavaScript frameworks (e.g. Angular, React, Node.js); a minimal Python example follows this list
- Hands-on experience with databases (e.g. MySQL, MongoDB) and web servers (e.g. Apache)
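As a rough, hedged illustration of the back-end side of this role, here is a minimal REST endpoint in Python using Flask (Python being one of the back-end languages listed above); the route, fields, and in-memory storage are placeholders for the example only.

    # Minimal illustrative REST endpoint with Flask: list and create "features" in memory.
    # Route names and fields are placeholders; a real service would use MySQL/MongoDB.
    from flask import Flask, jsonify, request

    app = Flask(__name__)
    features = []  # in-memory store, for illustration only

    @app.route("/api/features", methods=["GET"])
    def list_features():
        return jsonify(features)

    @app.route("/api/features", methods=["POST"])
    def create_feature():
        payload = request.get_json(force=True)
        feature = {"id": len(features) + 1, "name": payload.get("name", "")}
        features.append(feature)
        return jsonify(feature), 201

    if __name__ == "__main__":
        app.run(debug=True)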
Signs you might be a good fit for this role:
- You are self-driven, scrappy and entrepreneurial
- You enjoy challenges and are excited to find simple solutions to complex problems
- You put users first
- You're a forever learner
- You want to work in a fast-paced (read: messy) startup environment
- You're usually more aware of the hidden hacks & tools of a phone or a computer than most other people
- You like challenges & don't get bogged down by failure easily
- You are a firm believer in the quote "if you're not a part of the solution, you're part of the problem"
See you on the other side!
- Understand the fundamentals of Software Engineering, such as Data Structures, Algorithms, Design Patterns
- Ability to write Java applications using Spring, Spring Boot, or other microservices frameworks
- Experience in the development of REST applications, in Java or Kotlin
- Capable of writing effective APIs
- Solid knowledge of JVM fundamentals such as classloading, memory management, garbage collection
- Demonstrated experience in platform API design and development
- Knowledge of microservice and event-driven architecture
- Experience working with version control systems such as Git (preferred) or SVN
- Experience with NoSQL databases (MongoDB or Cassandra) & Relational Databases
- Solid understanding of TDD & Agile principles such as CI / CD, with a proven track record of implementing solutions centred around those concepts
- Excellent communication, collaboration, reporting, analytical and problem solving skills
Our frontend (React) team is passionate about React, and we are working on a few world-class products whose roots and frontends are in React, along with Redux, hooks, and reducers. Keeping performance, scalability, usability, and user acceptance in mind, we are looking for a smart frontend engineer with experience developing a variety of web applications using React. Along with strong frontend expertise, you need a good understanding of how frontend applications behave in different browsers and how to build mobile-responsive designs.
Roles & Responsibilities
The basic roles and responsibilities for an engineer on our team are:
- Strong proficiency in JavaScript, including DOM manipulation and the JavaScript object model
- You will ensure that these components and the overall application are robust and easy to maintain.
- You will coordinate with the rest of the team working on different layers of the infrastructure.
- Good understanding of async functions and their concepts.
- Your primary focus will be on developing user interface components and implementing them following well-known ReactJS workflows (such as Flux or Redux).
- A commitment to collaborative problem solving, sophisticated design, and quality products is important.
Responsibilities:
- Developing new user-facing features using ReactJS
- Familiarity with modern front-end build pipelines and tools.
- Translating designs and wireframes into high-quality code.
- Thorough understanding of ReactJS and its core principles.
- Experience with popular React.js workflows (such as Flux or Redux).
- Familiarity with RESTful APIs.
- Building reusable components and front-end libraries for future use.
- Knowledge of modern authorization mechanisms, such as JSON Web Tokens and OAuth.
- Good knowledge of CSS and modern frameworks like Tailwind CSS or Bootstrap.
Person Specification and Qualifications
- Experience developing highly scalable Web applications.
- Knowledge of media queries and CSS via common frameworks.
- Strong experience with the React ecosystem, including Redux, Flux, and hooks.
- Good knowledge of code versioning (Git/GitHub).
- Good experience deploying code on Linux servers such as Ubuntu.
- In-depth knowledge of designing and developing software in distributed architectures for multi-tier applications.
- Basic understanding of Docker for working with different major versions of Node.js and database releases (Postgres, MySQL, etc.).
- Ability to write unit tests for application code, and to train the team to do the same.
- Code reviews: you should know how to do a basic code review for the team and for junior Node.js engineers.
Plus points if you're familiar with the following:
- Experience with deployment and CI/CD
- Experience with testing libraries
- Good verbal and written communication skills
- Designing and developing new web applications.
- Maintaining and troubleshooting existing web applications.
- Writing and maintaining reliable Ruby code.
- Integrating data storage solutions.
- Creating back-end components.
- Identifying and fixing bottlenecks and bugs.
- Integrating user-facing elements designed by the front-end team.
- Connecting applications with additional web servers.
- Maintaining APIs.
Location: Chennai (Guindy Industrial Estate)
Duration: Full-time role
Company: Mobile Programming (https://www.mobileprogramming.com/)
Client Name: Samsung
We are looking for a Data Engineer to join our growing team of analytics experts. The hire will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software developers, database architects, data analysts, and data scientists on data initiatives, and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.
Responsibilities for Data Engineer
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS big data technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
Qualifications for Data Engineer
- Experience building and optimizing big data ETL pipelines, architectures, and data sets.
- Advanced working SQL knowledge and experience with relational databases, query authoring (SQL), and working familiarity with a variety of databases.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management.
- A successful history of manipulating, processing, and extracting value from large, disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
We are looking for a candidate with 3-6 years of experience in a Data Engineer role who has a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field. They should also have experience using the following software/tools:
- Experience with big data tools: Spark, Kafka, HBase, Hive, etc.
- Experience with relational SQL and NoSQL databases
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with stream-processing systems: Storm, Spark Streaming, etc. (a minimal streaming sketch appears below)
- Experience with object-oriented/functional scripting languages: Python, Java, Scala, etc.
Skills: Big Data, AWS, Hive, Spark, Python, SQL
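As a hedged illustration of the stream-processing experience above, here is a minimal PySpark Structured Streaming sketch that reads from a Kafka topic and writes to the console; the broker address and topic name are placeholders, and running it assumes the spark-sql-kafka connector package is available on the cluster.

    # Minimal illustrative Spark Structured Streaming job: consume a Kafka topic and
    # print the messages. Broker address and topic name are placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col

    spark = SparkSession.builder.appName("kafka_console_demo").getOrCreate()

    events = (
        spark.readStream
             .format("kafka")
             .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
             .option("subscribe", "user-events")                # placeholder topic
             .load()
             .select(col("key").cast("string"), col("value").cast("string"))
    )

    query = events.writeStream.format("console").outputMode("append").start()
    query.awaitTermination()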









