Job Description:
We are looking for a Big Data Engineer who has worked across the entire ETL stack: someone who has ingested data in both batch and streaming formats, transformed large volumes of data daily, built data warehouses to store the transformed data, and integrated visualization dashboards and applications with those data stores. The primary focus will be on choosing optimal solutions for these purposes, then implementing, maintaining, and monitoring them.
Responsibilities:
- Develop, test, and implement data solutions based on functional / non-functional business requirements.
- Code daily in Scala and PySpark on both cloud and on-prem infrastructure.
- Build data models that store data in the most optimized manner.
- Identify, design, and implement process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Implement the ETL process and an optimal data pipeline architecture.
- Monitor performance and advise on any necessary infrastructure changes.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.
- Proactively identify potential production issues, and recommend and implement solutions.
- Write quality code and build secure, highly available systems.
- Create design documents that describe the functionality, capacity, architecture, and process.
- Review peers' code and pipelines before deployment to production, checking for optimization issues and adherence to code standards.
Skill Sets:
- Good understanding of optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and ‘big data’ technologies.
- Proficient understanding of distributed computing principles
- Experience working with batch-processing/real-time systems using open-source technologies such as Spark, Pig, Hive, Apache Airflow, and NoSQL stores.
- Experience implementing complex projects dealing with considerable data sizes (petabyte scale).
- Optimization techniques (performance, scalability, monitoring, etc.)
- Experience with integration of data from multiple data sources
- Experience with NoSQL databases, such as HBase, Cassandra, MongoDB, etc.
- Knowledge of various ETL techniques and frameworks, such as Flume
- Experience with various messaging systems, such as Kafka or RabbitMQ
- Creation of DAGs for data engineering
- Expert in Python/Scala programming, especially for data engineering/ETL purposes
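For context on the DAG requirement above: orchestrators such as Apache Airflow model a pipeline as a directed acyclic graph of tasks, where each task runs only after its upstream dependencies complete. As an illustrative sketch (plain Python standard library rather than Airflow itself; the task names are hypothetical), a valid execution order for such a graph can be derived with a topological sort:

```python
from graphlib import TopologicalSorter  # stdlib since Python 3.9

# Hypothetical ETL pipeline: extract two feeds, join them, load the result.
# Keys are tasks; values are the upstream tasks they depend on.
pipeline = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_join": {"extract_orders", "extract_customers"},
    "load_warehouse": {"transform_join"},
}

# static_order() yields tasks so that every task appears after its dependencies.
order = list(TopologicalSorter(pipeline).static_order())
print(order)
```

An orchestrator does essentially this, plus scheduling, retries, and monitoring of each task run.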

About Ganit Business Solutions
Ganit Inc. is in the business of enhancing the Decision Making Power (DMP) of businesses by offering solutions that lie at the crossroads of discovery-based artificial intelligence, hypothesis-based analytics, and the Internet of Things (IoT).
The company's offerings consist of a functioning product suite and bespoke services. The goal is to integrate these solutions into the core of clients' decision-making processes as seamlessly as possible. Customers in the FMCG/CPG, Retail, Logistics, Hospitality, Media, Insurance, and Banking sectors are served by Ganit's offices in both India and the United States. The company views data as a strategic resource that can help other businesses grow both their top and bottom lines. We build and implement AI and ML solutions, purpose-built for specific sectors, to increase decision velocity and decrease decision risk.
Role: Tele-Calling Executive
Experience: Freshers to 3 years, with good communication skills
Location: Noida
Key Responsibilities:
- Make outbound calls to prospective customers and handle inbound inquiries.
- Explain company products/services clearly and professionally.
- Understand customer requirements and qualify leads.
- Generate interested leads and schedule meetings/site visits for the sales team.
- Maintain accurate records of calls, leads, and follow-ups in CRM/Excel.
- Conduct timely follow-ups with potential customers.
- Track call outcomes and submit daily/weekly performance reports.
- Coordinate with sales and internal teams to ensure smooth customer handling.
- Build and maintain positive customer relationships.
Skills & Attributes:
- Minimum Qualification: Graduate (Mandatory).
- Good communication and interpersonal skills.
- Strong convincing and negotiation skills.
- Lead generation and lead qualification ability.
- Understanding of sales funnel and customer lifecycle.
- Ability to handle objections and nurture prospects.
- Strong follow-up and relationship-building skills.
- Basic computer knowledge (MS Excel, CRM tools preferred).
- Target-oriented mindset.
- Prior experience in tele-calling, customer service, or lead handling preferred.
Work Location:
- Sector 90, Noida, Uttar Pradesh 201305
Key Responsibilities:
- Feature Development: Design, develop, and maintain new features and enhancements across the stack.
- Front-End: Build intuitive, responsive UIs using Angular or React.
- Back-End: Develop scalable APIs and services using Python (preferred), Java/Spring, or Node.js.
- Cloud Deployment: Deploy and manage applications on Google Cloud Platform (GCP) — familiarity with services like App Engine, Cloud Functions, Kubernetes is expected.
- Performance Tuning: Identify and optimize performance bottlenecks.
- Code Quality: Participate in code reviews and maintain high standards through unit testing and automation.
- DevOps & CI/CD: Collaborate on deployment pipelines using Tekton, Terraform, and other DevOps tools.
- Cross-Functional Collaboration: Work closely with Product Managers, UI/UX Designers, and fellow Engineers in an agile environment.
Must-Have Skills:
- Strong development expertise in Python (preferred), Angular, and GCP
- Understanding of DevOps practices
- Experience with SDLC, agile methodologies, and unit testing
Good to Have (Nice-to-Haves):
- Hands-on experience with:
  - Tekton, Terraform, CI/CD pipelines
  - Large Language Models (LLMs) integration
  - AWS/Azure (in addition to GCP)
- Contributions to open-source projects
- Familiarity with API design and microservices architecture
Educational Qualification:
- Required: Bachelor’s Degree in Computer Science, Engineering, or related discipline
Job Title: Senior Python Developer with AWS
Company: P99soft
Experience: 6+ years
Location: Hyderabad
Job Description:
P99soft, a leading technology company, is seeking a talented Senior Python Developer with expertise in AWS to join our dynamic team in Hyderabad. As a Senior Python Developer, you will play a key role in designing, implementing, and maintaining robust and scalable software solutions. If you have a passion for Python development and extensive experience with AWS technologies, we encourage you to apply and be a part of our innovative and collaborative work environment.
Responsibilities:
- Design, develop, test, and deploy high-performance and scalable Python applications.
- Collaborate with cross-functional teams to define and implement innovative solutions for the company's software needs.
- Work closely with stakeholders to understand business requirements and translate them into technical specifications.
- Utilize your expertise in AWS services to architect and implement cloud-based solutions.
- Troubleshoot and debug issues, ensuring the overall quality and performance of the software.
- Stay updated on industry trends and best practices to continuously improve software development processes.
Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Senior Python Developer with a minimum of 6 years of hands-on development experience.
- Strong proficiency in Python and experience with popular frameworks such as Django or Flask.
- In-depth knowledge of AWS services and experience with cloud-based architecture.
- Solid understanding of software development principles, design patterns, and best practices.
- Experience with database systems, both SQL and NoSQL.
- Excellent communication and collaboration skills.
- Ability to work independently and as part of a team in a fast-paced environment.
How to Apply:
If you have a passion for backend development, possess the required experience, and are excited about contributing to innovative projects, we invite you to apply. Please submit your resume, cover letter, and any relevant portfolio or project samples to Anusha Kalidindi/Sravani Chiruvolu.
P99soft is an equal opportunity employer, dedicated to fostering diversity and creating an inclusive workplace for all employees.
We are looking for a passionate technologist with experience building SaaS products for a once-in-a-lifetime opportunity: leading Engineering for an AI-powered Financial Operations platform that seamlessly monitors, optimizes, reconciles, and forecasts cashflow.
Background
An incredibly rare opportunity for a VP Engineering to join a top-tier VC-incubated SaaS startup and an outstanding management team. The product is currently in the build stage with a solid design-partner pipeline of ~$250K, and the company will soon raise a pre-seed/seed round with marquee investors.
Responsibilities
- Develop and implement the company's technical strategy and roadmap, ensuring that it aligns with the overall business objectives and is scalable, reliable, and secure.
- Manage and optimize the company's technical resources, including staffing, software, hardware, and infrastructure, to ensure that they are being used effectively and efficiently.
- Work with the founding team and other executives to identify opportunities for innovation and new technology solutions, and evaluate the feasibility and impact of these solutions on the business.
- Lead the engineering function in developing and deploying high-quality software products and solutions, ensuring that they meet or exceed customer requirements and industry standards.
- Analyze and evaluate technical data and metrics, identifying areas for improvement and implementing changes to drive efficiency and effectiveness.
- Ensure that the company is in compliance with all legal and regulatory requirements, including data privacy and security regulations.
Eligibility criteria:
- 6+ years of experience developing scalable SaaS products.
- Strong technical background with a focus on SaaS, AI, and finance software.
- Prior experience in leadership roles.
- Entrepreneurial mindset, with a strong desire to innovate and grow a startup from the ground up.
Perks:
- Vested Equity.
- Ownership in the company.
- Build alongside passionate and smart individuals.
About us
Skit (previously known as Vernacular.ai, http://vernacular.ai/) is an AI-first SaaS voice automation company. Its suite of speech and language solutions enables enterprises to automate their contact centre operations. With over 10 million hours of training data, its product, Vernacular Intelligent Voice Assistant (VIVA), can currently respond in 16+ languages, covering 160+ dialects and replicating human-like conversations.
Skit currently serves a variety of enterprise clients across diverse sectors such as BFSI, F&B, Hospitality, Consumer Electronics, and Travel & Tourism, including prominent clients like Axis Bank, Hathway, Porter, and Barbeque Nation. It has been featured as one of the top-notch start-ups in Cisco Launchpad's Cohort 6 and is part of the World Economic Forum's Global Innovators Community. It has also been listed in Forbes 30 Under 30 Asia start-ups 2021 for its remarkable industry innovation.
We are looking for ML Research Engineers to work on the following problems:
- Spoken Language Understanding and Dialog Management.
- Language semantics, parsing, and modeling across multiple languages.
- Speech Recognition, Speech Analytics and Voice Processing across multiple languages.
- Response Generation and Speech Synthesis.
- Active Learning, Monitoring and Observability mechanisms for deployments.
Responsibilities
- Design, build and evaluate Machine Learning solutions.
- Perform experiments and statistical analyses to draw conclusions and take modeling decisions.
- Study, implement and extend state of the art systems.
- Take part in regular research reviews and discussions.
- Build, maintain and extend our open source solutions in the domain.
- Write well-crafted programs at all levels of the system. This includes the data pipelines, experiment prototypes, fast and scalable deployment models, and evaluation, visualization and monitoring systems.
Requirements
- Practical Machine Learning experience as demonstrated by earlier works.
- Knowledge of and ability to use tools from theoretical and practical aspects of computer science. This includes, but is not limited to, probability, statistics, learning theory, algorithms, software architecture, programming languages, etc.
- Good programming skills and the ability to work with programs at all levels of a finished Machine Learning product. We prefer language agnosticism, since it exemplifies this point.
- Git portfolios and blogs are helpful as they let us better evaluate your work.
This privacy notice explains:
- What information we collect during our application and recruitment process and why we collect it;
- How we use that information; and
- How to access and update that information.
This policy covers the information you share with Skit (Cyllid Technologies Pvt. Ltd.) during the application or recruitment process including:
- Your name, address, email address, telephone number and other contact information;
- Your resume or CV, cover letter, previous and/or relevant work experience or other experience, education, transcripts, or other information you provide to us in support of an application and/or the application and recruitment process;
- Information from interviews and phone-screenings you may have, if any;
- Details of the type of employment you are or may be looking for, current and/or desired salary and other terms relating to compensation and benefits packages, willingness to relocate, or other job preferences;
- Details of how you heard about the position you are applying for;
- Reference information and/or information received from background checks (where applicable), including information provided by third parties;
- Information about your educational and professional background from publicly available sources, including online, that we believe is relevant to your application or a potential future application (e.g. your LinkedIn profile); and/or
- Information related to any assessment you may take as part of the interview screening process.
Your information will be used by Skit for the purposes of carrying out its application and recruitment process which includes:
- Assessing your skills, qualifications and interests against our career opportunities;
- Verifying your information and carrying out reference checks and/or conducting background checks (where applicable) if you are offered a job;
- Communications with you about the recruitment process and/or your application(s), including, in appropriate cases, informing you of other potential career opportunities at Skit;
- Creating and/or submitting reports as required under any local laws and/or regulations, where applicable;
- Making improvements to Skit's application and/or recruitment process including improving diversity in recruitment practices;
- Proactively conducting research about your educational and professional background and skills and contacting you if we think you would be suitable for a role with us.
Position: Full Stack Developer
Location: Hyderabad, India
Must have skills:
- Experience working with Typescript
- Experience working with Vue (including Vuex for state management, Vue cli etc.)
- Knowledge of one or more CSS preprocessors/JavaScript bundlers like SASS, Less, Webpack, Parcel, Rollup.
- Experience integrating with RESTful or other web services.
- Proficiency with Git
- Appreciation for clean and well documented code
- Thorough understanding of user experience and possibly even product strategy
- Information Security experience as a Security Analyst (SOC)
- Good understanding of security solutions like Anti-virus, DLP, Proxy, Firewall filtering/monitoring, IPS, Email Security, EPO, WAF etc.
- Hands-on experience with IBM QRadar and ArcSight SIEM tools for log monitoring and analysis, and the ServiceNow ticketing tool.
- Good knowledge of networking concepts, including OSI layers, subnets, TCP/IP, ports, DNS, DHCP, firewall monitoring, content filtering, Check Point, etc.
