50+ SQL Jobs in India
Who We Are
At Sonatype, we help organizations build better, more secure software by enabling them to understand and control their software supply chains. Our products are trusted by thousands of engineering teams globally, providing critical insights into dependency health, license risk, and software security. We’re passionate about empowering developers—and we back it with data.
The Opportunity
We’re looking for a Data Engineer with full stack expertise to join our growing Data Platform team. This role blends data engineering, microservices, and full-stack development to deliver end-to-end services that power analytics, machine learning, and advanced search across Sonatype.
You will design and build data-driven microservices and workflows using Java, Python, and Spring Batch, implement frontends for data workflows, and deploy everything through CI/CD pipelines into AWS ECS/Fargate. You’ll also ensure services are monitorable, debuggable, and reliable at scale, while clearly documenting designs with Mermaid-based sequence and dataflow diagrams.
This is a hands-on engineering role for someone who thrives at the intersection of data systems, fullstack development, ML, and cloud-native platforms.
What You’ll Do
- Design, build, and maintain data pipelines, ETL/ELT workflows, and scalable microservices.
- Develop complex web scraping workflows (Playwright) and real-time pipelines (Kafka, queues, Flink).
- Develop end-to-end microservices with backend (Java, 5+ years; Python, 5+ years; Spring Batch, 2+ years) and frontend (React or similar).
- Deploy, publish, and operate services in AWS ECS/Fargate using CI/CD pipelines (Jenkins, GitOps).
- Architect and optimize data storage models in SQL (MySQL, PostgreSQL) and NoSQL stores.
- Implement web scraping and external data ingestion pipelines.
- Enable Databricks and PySpark-based workflows for large-scale analytics.
- Build advanced data search capabilities (fuzzy matching, vector similarity search, semantic retrieval); a minimal sketch follows this list.
- Apply ML techniques (scikit-learn, classification algorithms, predictive modeling) to data-driven solutions.
- Implement observability, debugging, monitoring, and alerting for deployed services.
- Create Mermaid sequence diagrams, flowcharts, and dataflow diagrams to document system architecture and workflows.
- Drive best practices in fullstack data service development, including architecture, testing, and documentation.
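For illustration, here is a minimal sketch of two of the search techniques named above, fuzzy matching and vector similarity, using only Python's standard library and NumPy. The package names, vectors, and rankings are hypothetical, not Sonatype's actual implementation.

```python
# Minimal sketch of fuzzy matching and vector similarity search.
# Package names and embedding vectors are illustrative only.
from difflib import SequenceMatcher
import numpy as np

def fuzzy_score(query: str, candidate: str) -> float:
    """Similarity ratio in [0, 1] based on longest matching subsequences."""
    return SequenceMatcher(None, query.lower(), candidate.lower()).ratio()

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Fuzzy matching: rank package names against a (hypothetical) query.
packages = ["spring-batch", "spring-boot", "springfox", "apache-spark"]
query = "spring batch"
ranked = sorted(packages, key=lambda p: fuzzy_score(query, p), reverse=True)
print(ranked[0])  # -> "spring-batch"

# Vector similarity: nearest neighbour over toy embeddings.
embeddings = {name: np.random.rand(8) for name in packages}  # stand-in vectors
query_vec = embeddings["spring-batch"] + 0.01                # perturbed copy
best = max(embeddings, key=lambda n: cosine_similarity(query_vec, embeddings[n]))
print(best)  # -> "spring-batch"
```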
What We’re Looking For
- 5+ years of experience as a Data Engineer or in a backend software engineering role.
- Strong programming skills in Python, Scala, or Java.
- Hands-on experience with HBase or similar NoSQL columnar stores.
- Hands-on experience with distributed data systems like Spark, Kafka, or Flink.
- Proficient in writing complex SQL and optimizing queries for performance.
- Experience building and maintaining robust ETL/ELT pipelines in production.
- Familiarity with workflow orchestration tools (Airflow, Dagster, or similar).
- Understanding of data modeling techniques (star schema, dimensional modeling, etc.).
- Familiarity with CI/CD pipelines (Jenkins or similar).
- Ability to visualize and communicate architectures using Mermaid diagrams.
Bonus Points
- Experience working with Databricks, dbt, Terraform, or Kubernetes
- Familiarity with streaming data pipelines or real-time processing
- Exposure to data governance frameworks and tools
- Experience supporting data products or ML pipelines in production
- Strong understanding of data privacy, security, and compliance best practices
Why You’ll Love Working Here
- Data with purpose: Work on problems that directly impact how the world builds secure software
- Modern tooling: Leverage the best of open-source and cloud-native technologies
- Collaborative culture: Join a passionate team that values learning, autonomy, and impact
As a Software Development Engineer II at Hiver, you will play a critical role in building and scaling our product to thousands of users globally. We are growing very fast and process over 5 million emails daily for thousands of active users on the Hiver platform. You will get the chance to work with and mentor a group of smart engineers, as well as learn and grow yourself under very good mentors. You will work on complex technical challenges such as making the architecture scalable to handle growing traffic, building frameworks to monitor and improve the performance of our systems, and improving reliability and performance. You will code, design, develop, and maintain new product features, and improve existing services for performance and scalability.
What will you be working on?
- Build a new API for our users, or iterate on existing APIs in monolith applications.
- Build event-driven architectures using highly scalable message brokers like Kafka, RabbitMQ, etc. (a minimal sketch follows this list).
- Build microservices based on the performance and efficiency needs.
- Build frameworks to monitor and improve the performance of our systems.
- Build and upgrade systems to securely store sensitive data.
- Design, build and maintain APIs, services, and systems across Hiver's engineering teams.
- Debug production issues across services and multiple levels of the stack.
- Work with engineers across the company to build new features at large-scale.
- Improve engineering standards, tooling, and processes.
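As a small illustration of the event-driven pattern above, here is a sketch using the kafka-python client. The broker address, topic name, and event payload are placeholders, not Hiver's actual setup.

```python
# Minimal event-driven sketch with kafka-python (pip install kafka-python).
# Broker address and topic name are placeholders.
import json
from kafka import KafkaProducer, KafkaConsumer

BROKER = "localhost:9092"   # placeholder broker
TOPIC = "email-events"      # hypothetical topic

# Producer: publish an event instead of calling the downstream service directly.
producer = KafkaProducer(
    bootstrap_servers=BROKER,
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send(TOPIC, {"email_id": 42, "action": "assigned"})
producer.flush()

# Consumer: an independent service processes events at its own pace.
consumer = KafkaConsumer(
    TOPIC,
    bootstrap_servers=BROKER,
    group_id="notification-service",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
for message in consumer:  # blocks until an event arrives
    print(message.value)  # e.g. {'email_id': 42, 'action': 'assigned'}
    break
```

The design benefit being sketched: the producer and consumer never call each other directly, so either side can be scaled or redeployed independently while the broker absorbs traffic spikes.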
What are we looking for?
- Have at least 3 years of experience scaling backend systems.
- Knowledge of Ruby on Rails (RoR) or Python is good to have, with hands-on experience in at least one project.
- Have worked on microservice and event-driven architectures.
- Have worked with technologies like Kafka to build data pipelines handling high volumes of data.
- Enjoy and have experience building Lean APIs and amazing backend services.
- Think about systems and services and write high-quality code. We care much more about your general engineering skill than knowledge of a particular language or framework.
- Have worked extensively with SQL Databases and understand NoSQL databases and Caches.
- Have experience deploying applications on the cloud. We are on AWS, but experience with any cloud provider (GCP, Azure) would be great.
- Hold yourself and others to a high bar when working with production systems.
- Take pride in working on projects to successful completion involving a wide variety of technologies and systems.
- Thrive in a collaborative environment involving different stakeholders.
- Enjoy working with a diverse group of people with different expertise.
Job Overview:
We are seeking a highly skilled Data Analyst with 6+ years of experience in analytics, data modeling, and advanced SQL. The ideal candidate has strong expertise in building scalable data models using dbt, writing efficient Python scripts, and delivering high-quality insights that support data-driven decision-making.
Key Responsibilities
- Design, develop, and maintain data models using dbt (Core and dbt Cloud).
- Build and optimize complex SQL queries to support reporting, analytics, and data pipelines.
- Write Python scripts for data transformation, automation, and analytics workflows.
- Ensure data quality, integrity, and consistency across multiple data sources.
- Collaborate with cross-functional teams (Engineering, Product, Business) to understand data needs.
- Develop dashboards and reports to visualize insights (using tools such as Tableau, Looker, or Power BI).
- Perform deep-dive exploratory analysis to identify trends, patterns, and business opportunities.
- Document data models, pipelines, and processes.
- Contribute to scaling the analytics stack and improving data architecture.
Required Qualifications
- 6 - 9 years of hands-on experience in data analytics or data engineering.
- Expert-level skills in SQL (complex joins, window functions, performance tuning); see the sketch after this list.
- Strong experience building and maintaining dbt data models.
- Proficiency in Python for data manipulation, scripting, and automation.
- Solid understanding of data warehousing concepts (e.g., dimensional modeling, ELT/ETL pipelines).
- Experience with cloud data platforms (Snowflake, BigQuery, Redshift, etc.).
- Strong analytical thinking and problem-solving skills.
- Excellent communication skills with the ability to present insights to stakeholders.
- Trino and lakehouse architecture experience is good to have.
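As a concrete illustration of the window-function SQL called for above, here is a small self-contained example run against an in-memory SQLite database (window functions require SQLite 3.25+, bundled with recent Python builds). The table and columns are made up for the demo.

```python
# Window-function SQL demo against in-memory SQLite (requires SQLite >= 3.25).
# Table and column names are illustrative, not a real analytics schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('alice', '2024-01-05', 120.0),
        ('alice', '2024-02-10',  80.0),
        ('bob',   '2024-01-20', 200.0),
        ('bob',   '2024-03-02',  50.0);
""")

# Rank each customer's orders by recency and keep a running revenue total.
rows = conn.execute("""
    SELECT customer,
           order_date,
           amount,
           ROW_NUMBER() OVER (PARTITION BY customer ORDER BY order_date DESC) AS recency_rank,
           SUM(amount)  OVER (PARTITION BY customer ORDER BY order_date)      AS running_total
    FROM orders
    ORDER BY customer, order_date;
""").fetchall()

for row in rows:
    print(row)
```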
About Tarento:
Tarento is a fast-growing technology consulting company headquartered in Stockholm, with a strong presence in India and clients across the globe. We specialize in digital transformation, product engineering, and enterprise solutions, working across diverse industries including retail, manufacturing, and healthcare. Our teams combine Nordic values with Indian expertise to deliver innovative, scalable, and high-impact solutions.
We're proud to be recognized as a Great Place to Work, a testament to our inclusive culture, strong leadership, and commitment to employee well-being and growth. At Tarento, you’ll be part of a collaborative environment where ideas are valued, learning is continuous, and careers are built on passion and purpose.
Scope of Work:
- Support the migration of applications from Windows Server 2008 to Windows Server 2019 or 2022 in an IaaS environment.
- Migrate IIS websites, Windows Services, and related application components.
- Assist with migration considerations for SQL Server connections, instances, and basic data-related dependencies.
- Evaluate and migrate message queues (MSMQ or equivalent technologies).
- Document the existing environment, migration steps, and post-migration state.
- Work closely with DevOps, development, and infrastructure teams throughout the project.
Required Skills & Experience:
- Strong hands-on experience with IIS administration, configuration, and application migration.
- Proven experience migrating workloads between Windows Server versions, ideally legacy to modern.
- Knowledge of Windows Services setup, configuration, and troubleshooting.
- Practical understanding of SQL Server (connection strings, service accounts, permissions).
- Experience with message queues (IBM MQ, MSMQ, or similar) and their migration considerations.
- Ability to identify migration risks, compatibility constraints, and remediation options.
- Strong troubleshooting and analytical skills.
- Familiarity with Microsoft technologies (.NET, etc.).
- Networking and Active Directory knowledge.
Desirable / Nice-to-Have
- Exposure to CI/CD tools, especially TeamCity and Octopus Deploy.
- Familiarity with Azure services and related tools (Terraform, etc)
- PowerShell scripting for automation or configuration tasks.
- Understanding of enterprise change management and documentation practices.
- Awareness of security best practices.
Soft Skills
- Clear written and verbal communication.
- Ability to work independently while collaborating with cross-functional teams.
- Strong attention to detail and a structured approach to execution.
- A methodical approach to troubleshooting.
- Willingness to learn.
Location & Engagement Details
We are looking for a Senior DevOps Consultant for an onsite role in Stockholm (Sundbyberg office). This opportunity is open to candidates currently based in Bengaluru who are willing to relocate to Sweden for the assignment.
The role will start with an initial 6-month onsite engagement, with the possibility of extension based on project requirements and performance.
About the Role
We are looking for a motivated Full Stack Developer with 2–5 years of hands-on experience building scalable web applications. You will work closely with senior engineers and product teams to develop new features, improve system performance, and ensure high-quality code delivery.
Responsibilities
- Develop and maintain full-stack applications.
- Implement clean, maintainable, and efficient code.
- Collaborate with designers, product managers, and backend engineers.
- Participate in code reviews and debugging.
- Work with REST APIs/GraphQL.
- Contribute to CI/CD pipelines.
- Ability to work independently as well as within a collaborative team environment.
Required Technical Skills
- Strong knowledge of JavaScript/TypeScript.
- Experience with React.js, Next.js.
- Backend experience with Node.js, Express, NestJS.
- Understanding of SQL/NoSQL databases.
- Experience with Git, APIs, and debugging tools.
- Cloud familiarity (AWS/GCP/Azure).
AI and System Mindset
Experience working with AI-powered systems is a strong plus. Candidates should be comfortable integrating AI agents, third-party APIs, and automation workflows into applications, and should demonstrate curiosity and adaptability toward emerging AI technologies.
Soft Skills
- Strong problem-solving ability.
- Good communication and teamwork.
- Fast learner and adaptable.
Education
Bachelor's degree in Computer Science / Engineering or equivalent.
Qualifications:
- Must have a Bachelor’s degree in computer science or equivalent.
- Must have 5+ years of experience as an SDET.
- At least 1 year of leadership experience or experience managing a team.
Responsibilities:
- Design, develop and execute automation scripts using open-source tools.
- Troubleshoot errors and streamline testing procedures.
- Writing and executing detailed test plans, test design & test cases covering feature, integration, regression, certification, system level testing as well as release validation in production.
- Identify, analyze and create detailed records of problems that appear during testing, such as software defects, bugs, functionality issues, and output errors, and work directly with software developers to find solutions and develop retesting procedures.
- Good time-management skills and commitment to meet deadlines.
- Stay up-to-date with new testing tools and test strategies.
- Driving technical projects and providing leadership in an innovative and fast-paced environment.
Requirements:
- Experience in automation (API and UI) as well as manual testing of web applications.
- Experience in frameworks like Playwright / Selenium WebDriver / Robot Framework / REST Assured; a minimal Playwright sketch follows this list.
- Must be proficient in Performance Testing tools like K6 / Gatling / JMeter.
- Must be proficient in Core Java / TypeScript and Java 17.
- Experience in JUnit-5.
- Good to have TypeScript experience.
- Good to have RPA Experience using Java or any other tools like Robot Framework / Automation Anywhere.
- Experience in SQL (like MySQL, PG) & No-SQL Database (like MongoDB).
- Good understanding of software & systems architecture.
- Well acquainted with Agile Methodology, Software Development Life Cycle (SDLC), Software Test Life Cycle (STLC) and Automation Test Life Cycle.
- Strong experience in REST-based component testing, back-end, DB, and microservices testing.
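For illustration, a minimal UI-plus-API check using Playwright's Python binding (this posting's stack is Java/TypeScript; Python is used here only to keep all examples in one language). The URL and assertions are placeholders, not the team's actual suite.

```python
# Minimal UI + API check with Playwright's Python binding
# (pip install playwright && playwright install chromium).
# URL and assertions are placeholders.
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch(headless=True)
    page = browser.new_page()

    # UI check: load the page and assert on its title.
    page.goto("https://example.com")
    assert "Example" in page.title()

    # API check: reuse the same runtime for a REST call.
    response = page.request.get("https://example.com")
    assert response.ok

    browser.close()
```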
Work Location: Jayanagar - Bangalore.
Experience: 2+ years
Must-Have: Candidate must have prior experience in a product-based company
Role Summary: We are looking for a passionate Product Manager / APM to own and enhance the end-to-end product experience for both FOY Store (India & Global) and Personalise Me (Skin AI & Makeup Try-On). You will drive conversion, revenue, personalization, customer experience, and operational efficiency while collaborating closely with cross-functional teams including engineering, marketing, catalog, operations, and data/ML teams.
Key Responsibilities:
FOY Store:
- Own the full customer journey: CTR → ATC → Checkout → Purchase → Repeat
- Define assortment strategy, navigation, product discovery, search, filters, PLPs, PDPs
- Collaborate with brand, catalog, marketing, and operations for pricing, availability, and content accuracy
- Run rapid A/B experiments to optimize funnel and conversion
- Build scalable product integrations with payments, logistics, loyalty, and subscriptions
- Define product roadmap and write PRDs / user stories for engineering
- Track and improve store GMV, margins, retention, cancellations, COD risk
Personalise Me (Skin AI + Makeup Try-On):
- Own the hyper-personalized beauty experience: Skin AI test, Virtual Try-On, BeautyGPT
- Collaborate with data/ML teams to improve recommendation accuracy
- Understand beauty user profiles, concerns, and preferences deeply
- Integrate personalized recommendations into the shopping journey to boost conversion
- Drive metrics: activation → profile completion → recommendation clicks → purchase
- Work with brand and catalog teams to tag inventory for personalization
Must-Have Skills:
- Strong analytical mindset + customer psychology understanding
- UI/UX intuition for ecommerce and personalization best practices
- Strong Google Sheets & Excel skills
- SQL proficiency
- Experience with funnels, Clevertap/GA, A/B testing tools
- Customer empathy, problem-solving, and curiosity for beauty tech and AI
Bonus Skills:
- Experience with ecommerce marketplaces, D2C, or AI-driven recommendation systems
- Experience with personalization, gamification, or form-based flows
- Knowledge of AI tools and product integrations
Why Join Us: Be part of a dynamic team shaping the future of beauty commerce, blending cutting-edge AI with customer-first product experiences.
🚀 Hiring: Java Developer at Deqode
⭐ Experience: 4+ Years
📍 Location: Indore, Pune, Mumbai, Nagpur, Noida, Kolkata, Bangalore, Chennai
⭐ Work Mode: Hybrid
⏱️ Notice Period: Immediate Joiners
(Only immediate joiners & candidates serving notice period)
Requirements
✅ Strong proficiency in Java (Java 8/11/17)
✅ Experience with Spring / Spring Boot
✅ Knowledge of REST APIs, Microservices architecture
✅ Familiarity with SQL/NoSQL databases
✅ Understanding of Git, CI/CD pipelines
✅ Problem-solving skills and attention to detail
Like us, you'll be deeply committed to delivering impactful outcomes for customers.
- 7+ years of demonstrated ability to develop resilient, high-performance, and scalable code tailored to application usage demands.
- Ability to lead by example with hands-on development while managing project timelines and deliverables.
- Experience in agile methodologies and practices, including sprint planning and execution, to drive team performance and project success.
- Deep expertise in Node.js, with experience in building and maintaining complex, production-grade RESTful APIs and backend services.
- Experience writing batch/cron jobs using Python and shell scripting (a skeleton example follows this list).
- Experience in web application development using JavaScript and JavaScript libraries.
- Basic understanding of TypeScript, JavaScript, HTML, CSS, JSON, and REST-based applications.
- Experience/familiarity with RDBMS and NoSQL database technologies like MySQL, MongoDB, Redis, Elasticsearch, and other similar databases.
- Understanding of code versioning tools such as Git.
- Understanding of building applications deployed on the cloud using Google Cloud Platform (GCP) or Amazon Web Services (AWS).
- Experience with JS-based build/package tools like Grunt, Gulp, Bower, Webpack.
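As a small illustration of the batch/cron work mentioned above, here is a skeleton Python job designed to be triggered by cron. The job name, schedule, and retention window are hypothetical.

```python
#!/usr/bin/env python3
# Skeleton for a batch job meant to be run from cron, e.g.:
#   0 2 * * * /usr/bin/python3 /opt/jobs/nightly_cleanup.py
# Path, job name, and retention window are hypothetical.
import logging
import sys
from datetime import datetime, timedelta

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("nightly_cleanup")

def run(retention_days: int = 30) -> int:
    """Delete records older than the retention window; returns count removed."""
    cutoff = datetime.utcnow() - timedelta(days=retention_days)
    log.info("purging records older than %s", cutoff.isoformat())
    removed = 0  # a real job would query the database here
    return removed

if __name__ == "__main__":
    try:
        log.info("done, removed %d records", run())
    except Exception:
        log.exception("batch job failed")
        sys.exit(1)  # non-zero exit so cron/monitoring can flag the failure
```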
About the company:
Aptroid Consulting (India) Pvt Ltd is a web development company focused on helping marketers transform the customer experience, increasing engagement and driving revenue by using customer data to inform and drive every interaction, in real time and tailored to each individual's behavior.
About the Role:
We are hiring for Senior Full Stack Developers to strengthen the LiveIntent engineering team. The role requires strong backend depth combined with solid frontend expertise to build and scale high-performance, data-intensive systems.
Candidates are expected to demonstrate excellent analytical and problem-solving skills, along with strong system design capabilities for large-scale, distributed applications. Prior experience in AdTech or similar high-throughput domains is highly desirable.
Required Skills & Experience:
- 7–12 years of hands-on experience in full-stack development
- Strong proficiency in Python with Django (ORM, REST APIs, performance tuning)
- Solid experience with Angular (modern versions, component architecture)
- Hands-on experience with Docker and Kubernetes in production environments
- Strong understanding of MySQL, including query optimization and schema design
- Experience using Datadog for monitoring, metrics, and observability
- Excellent analytical, problem-solving, and debugging skills
- Proven experience in system design for scalable, distributed systems
Good to Haves:
- Experience with Node.js
- Strong background in database schema design and data modeling
- Prior experience working in AdTech / MarTech / digital advertising platforms
- Exposure to event-driven systems, real-time data pipelines, or high-volume traffic systems
- Experience with CI/CD pipelines and cloud platforms (AWS)
Key Responsibilities:
- Design, develop, and maintain scalable full-stack applications using Python (Django) and Angular
- Build and optimize backend services handling large data volumes and high request throughput
- Design and implement RESTful APIs with a focus on performance, security, and reliability
- Lead and contribute to system design discussions covering scalability, fault tolerance, and observability
- Containerize applications using Docker and deploy/manage workloads on Kubernetes
- Design, optimize, and maintain MySQL database schemas, queries, and indexes
- Implement monitoring, logging, and alerting using Datadog
- Perform deep debugging and root-cause analysis of complex production issues
- Collaborate with product, platform, and data teams to deliver business-critical features
- Mentor junior engineers and promote engineering best practices
About the company:
Aptroid Consulting (India) Pvt Ltd is a web development company focused on helping marketers transform the customer experience, increasing engagement and driving revenue by using customer data to inform and drive every interaction, in real time and tailored to each individual's behavior.
About the Role:
We’re looking for a Senior Java & PHP Developer to join our backend engineering team that powers high-throughput, large-scale SaaS platforms — delivering billions of personalized communications and marketing events every month. You’ll work on mission-critical services that drive automation, data intelligence, and real-time campaign delivery across global clients. You’ll play a key role in designing scalable APIs, improving platform performance, and mentoring developers — while working closely with distributed teams aligned to US (EST) time zones.
Key Responsibilities:
- Architect, design, and develop scalable backend services using Java (Spring Boot, Microservices) and PHP (Laravel/Symfony/custom frameworks).
- Lead system design and architecture reviews, ensuring clean code, maintainability, and adherence to best practices.
- Drive API integrations, microservices deployment, and modernization of legacy components.
- Collaborate with product managers, DevOps, and data engineering teams to deliver high impact features and performance improvements.
- Build, maintain, and monitor data-intensive systems that handle large message/event volumes with high reliability.
- Implement strong observability practices (metrics, tracing, alerting) and contribute to production incident reviews.
- Perform code reviews, mentor junior engineers, and advocate engineering excellence.
- Work collaboratively across global teams during EST business hours for sprint planning, releases, and incident response.
Required Qualifications:
- 6–9 years of professional backend development experience.
- Expert in Java (Spring Boot, REST APIs, concurrency, JVM tuning) and PHP (Laravel/Symfony, Composer, PSR standards)
- Strong experience with MySQL / PostgreSQL, Redis, and NoSQL systems.
- Familiarity with AWS services (S3, Lambda, ECS/EKS, CloudWatch) and CI/CD pipelines (Jenkins, GitLab, GitHub Actions).
- Hands-on experience in scalable, distributed architectures and performance optimization.
- Strong debugging, profiling, and system performance tuning capabilities.
- Proven ability to deliver reliable, secure, and production-ready code in fast-paced agile environments.
- Excellent communication skills; able to coordinate with global teams across time zones.
Preferred Skills:
- Exposure to Kafka, RabbitMQ, or other event-driven systems.
- Familiarity with containerization (Docker) and orchestration (Kubernetes).
- Experience integrating with third-party APIs and external SDKs.
- Prior experience in AdTech, Martech, or high-volume SaaS platforms (similar to Sailthru/Zeta Global ecosystem).
- Knowledge of Python/Go for cross-service utilities or internal tooling is a plus.
What We Offer:
- Opportunity to work on high-scale enterprise systems with real-world impact.
- Exposure to global engineering practices and advanced cloud architectures.
- Collaborative culture with technical ownership and innovation freedom.
- Competitive compensation aligned with experience and global standards.
About Sun King
Sun King is the world’s leading off-grid solar energy company, providing affordable solar solutions to the 1.8 billion people without reliable access to electricity. By combining product design, fintech, and field operations, Sun King has connected over 20 million homes to solar power across Africa and Asia, adding more than 200,000 new homes each month. Through ‘pay-as-you-go’ financing, customers make small payments to eventually own their solar systems, saving money and reducing reliance on harmful energy sources like kerosene.
Sun King employs 2,800 staff across 12 countries, with expertise in product design, data science, logistics, customer service, and more. The company is expanding its product range to include clean cooking, electric mobility, and entertainment solutions, all while supporting a diverse workforce — with women making up 44% of the team.
About the role:
The role involves designing, executing, and maintaining robust functional, regression, and integration testing to ensure product quality and reliability, along with thorough defect tracking, analysis, and resolution. The individual will develop and maintain UI and API automation frameworks to improve test coverage, minimize manual effort, and enhance release efficiency. Close collaboration with development teams is expected to reproduce issues, validate fixes, and ensure high-quality releases. The role also includes integrating automated tests into CI/CD pipelines, supporting production issue analysis, and verifying hotfixes in live environments. Additionally, the candidate will actively participate in requirement and design reviews to ensure testability and clarity, maintain comprehensive QA documentation, and continuously improve testing frameworks, tools, and overall QA processes.
What you will be expected to do:
- Design, execute, and maintain test cases, test plans, and test scripts for functional, regression, and integration testing.
- Identify software defects, document them clearly, and track them through to closure.
- Analyze bugs and provide detailed insights to help developers understand root causes.
- Partner closely with the development team to reproduce issues, validate fixes, and ensure overall product quality.
- Develop, maintain, and improve automated test suites (API/UI) to enhance test coverage, reduce manual effort, and improve release confidence; a minimal API-test sketch follows this list.
- Work with CI/CD pipelines to integrate automated tests into the deployment workflow.
- Validate production issues, support troubleshooting, and verify hotfixes in real-time environments.
- Recommend improvements in product performance, usability, and reliability based on test findings.
- Participate in requirement and design reviews to ensure clarity, completeness, and testability.
- Benchmark against competitor products and suggest enhancements based on industry trends.
- Maintain detailed test documentation, including test results, defect logs, and release readiness assessments.
- Continuously improve QA processes, automation frameworks, and testing methodologies.
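For illustration, a minimal automated API test using pytest and requests, the kind of suite that slots into a CI/CD pipeline as described above. The endpoint and expected response fields are placeholders, not Sun King's real API.

```python
# Minimal API tests with pytest + requests (pip install pytest requests).
# Endpoint and response shape are placeholders; httpbin.org echoes requests.
# Run locally or in CI with: pytest test_api.py
import requests

BASE_URL = "https://httpbin.org"  # stand-in for the service under test

def test_endpoint_returns_ok():
    """Happy path: service responds 200 with a JSON body."""
    response = requests.get(f"{BASE_URL}/get", params={"user": "demo"}, timeout=10)
    assert response.status_code == 200
    assert response.headers["Content-Type"].startswith("application/json")

def test_request_params_are_echoed():
    """Data validation: the API reflects the query parameters we sent."""
    response = requests.get(f"{BASE_URL}/get", params={"user": "demo"}, timeout=10)
    body = response.json()
    assert body["args"] == {"user": "demo"}
```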
You might be a strong candidate if you have/are:
- Bachelor’s Degree in Computer Science, Information Technology, or a related field.
- 2+ years of hands-on experience in software testing (manual + exposure to automation).
- Strong understanding of QA methodologies, testing types, and best practices.
- Experience in designing and executing test cases, test plans, and regression suites.
- Exposure to automation tools/frameworks such as Selenium, Playwright, Cypress, TestNG, JUnit, or similar.
- Basic programming or scripting knowledge (Java/Python preferred).
- Good understanding of SQL for backend and data validation testing.
- Familiarity with API testing tools such as Postman or RestAssured.
- Experience with defect tracking and test management tools (Jira, TestRail, etc.).
- Strong analytical and debugging skills with the ability to identify root causes.
- Ability to work effectively in Agile/Scrum environments and partner with developers, product, and DevOps teams.
- Strong ownership mindset — having contributed to high-quality, near bug-free releases.
Good to have:
- Exceptional attention to detail and a strong focus on product quality.
- Experience with performance, load, or security testing (JMeter, Gatling, OWASP tools, etc.).
- Exposure to advanced automation frameworks or building automation scripts from scratch.
- Familiarity with CI/CD pipelines and integrating automated tests.
- Experience working with observability tools like Grafana, Kibana, and Prometheus for production verification.
- Good understanding of microservices, distributed systems, or cloud platforms.
What Sun King offers:
- Professional growth in a dynamic, rapidly expanding, high-social-impact industry
- An open-minded, collaborative culture made up of enthusiastic colleagues who are driven by the challenge of innovation towards profound impact on people and the planet.
- A truly multicultural experience: you will have the chance to work with and learn from people from different geographies, nationalities, and backgrounds.
- Structured, tailored learning and development programs that help you become a better leader, manager, and professional through the Sun King Center for Leadership.
Position: Lead Python Developer
Location: Ahmedabad
The client company includes a team of experienced information services professionals who are passionate about growing and enhancing the value of information services businesses. They provide support with the talent, technology, tools, infrastructure, and expertise required to deliver across the data ecosystem.
Position Summary
We are seeking a skilled and experienced Backend Developer with strong expertise in TypeScript, Python, and web scraping. You will be responsible for designing, developing, and maintaining scalable backend services and APIs that power our data-driven products. Your role will involve collaborating with cross-functional teams, optimizing system performance, ensuring data integrity, and contributing to the design of efficient and secure architectures.
Job Responsibility
● Design, develop, and maintain backend systems and services using Python and TypeScript.
● Develop and maintain web scraping solutions to extract, process, and manage large-scale data from multiple sources (a minimal sketch follows this list).
● Work with relational and non-relational databases, ensuring high availability, scalability, and performance.
● Implement authentication, authorization, and security best practices across services.
● Write clean, maintainable, and testable code following best practices and coding standards.
● Collaborate with frontend engineers, data engineers, and DevOps teams to deliver robust solutions and troubleshoot, debug, and upgrade existing applications.
● Stay updated with backend development trends, tools, and frameworks to continuously improve processes.
● Utilize core crawling experience to design efficient strategies for scraping the data from different websites and applications.
● Collaborate with technology and data collection teams to build end-to-end technology-enabled ecosystems, and partner in research projects to analyze massive data inputs.
● Design and develop web crawlers, independently solving the various problems encountered during development.
● Stay updated with the latest web scraping techniques, tools, and industry trends to continuously improve the scraping processes.
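For illustration, a minimal scraping-and-deduplication sketch with requests and BeautifulSoup. The URL, selectors, and record key are placeholders; real crawlers in this role would also need the proxy and bot-detection handling described in the requirements.

```python
# Minimal scraping sketch (pip install requests beautifulsoup4).
# URL, selectors, and the dedup key are placeholders.
import requests
from bs4 import BeautifulSoup

def fetch_titles(url: str) -> list[str]:
    """Download a page and extract text from every <h2> element."""
    response = requests.get(url, headers={"User-Agent": "example-crawler/0.1"}, timeout=10)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")
    return [h2.get_text(strip=True) for h2 in soup.find_all("h2")]

def dedupe(records: list[dict], key: str = "url") -> list[dict]:
    """Deduplication pass: keep the first record seen for each key."""
    seen: set[str] = set()
    return [r for r in records if not (r[key] in seen or seen.add(r[key]))]

if __name__ == "__main__":
    for title in fetch_titles("https://example.com"):
        print(title)
```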
Job Requirements
● 4+ years of professional experience in backend development with TypeScript and Python.
● Strong understanding of TypeScript-based server-side frameworks (e.g., Node.js, NestJS, Express) and Python frameworks (e.g., FastAPI, Django, Flask).
● Experience with tools and libraries for web scraping (e.g., Scrapy, BeautifulSoup, Selenium, Puppeteer)
● Hands-on experience with Temporal for creating and orchestrating workflows.
● Proven hands-on experience in web scraping, including crawling, data extraction, deduplication, and handling dynamic websites.
● Proficient in implementing proxy solutions and handling bot-detection challenges (e.g., Cloudflare).
● Experience working with Docker, containerized deployments, and cloud environments (GCP or Azure).
● Proficiency with database systems such as MongoDB and ElasticSearch.
● Hands-on experience with designing and maintaining scalable APIs.
● Knowledge of software testing practices (unit, integration, end-to-end).
● Familiarity with CI/CD pipelines and version control systems (Git).
● Strong problem-solving skills, attention to detail, and ability to work in agile environments.
● Great communication skills and ability to navigate in undirected situations.
Job Exposure:
● Opportunity to apply creative methods in acquiring and filtering North American government and agency data from various websites and sources
● In-depth industry exposure to data harvesting techniques to build and scale a robust, sustainable model using open-source applications
● Effective collaboration with the IT team to design tailor-made solutions based on clients' requirements
● Unique opportunity to research various agencies, vendors, and products, as well as technology tools, to compose a solution
Power BI Developer
Job Id: QX005
About Us:
QX Impact was launched with a mission to make AI accessible and affordable and to deliver AI products/solutions at scale for enterprises by bringing the power of data, AI, and engineering to drive digital transformation. We believe that without insights, businesses will continue to face challenges in understanding their customers, and may even lose them; without insights, businesses won't be able to deliver differentiated products and services; and without insights, businesses can't achieve the new level of "Operational Excellence" that is crucial to remaining competitive, meeting rising customer expectations, expanding markets, and digitalizing.
Job Summary:
We are seeking a creative, collaborative, adaptable Power BI Developer to join our agile team of highly skilled data scientists, data engineers, and UX developers. The Data Visualization Specialist is responsible for turning abstract information from data analyses into appealing and understandable visualizations that improve business insights from the results of the analyses. He or she is a creative thinker who understands user interface design and applies visualizations skills such as user experience design, data visualization, and graphical design. The individual in this role understands how information is turned into knowledge and how this knowledge supports and enables key business processes.
Key Responsibilities:
· Design and develop visualizations that turn complex datasets into simple, intuitive, interactive formats
· Develop data visualization techniques to support business analytics and semantic data access requirements
· Translate business analytics needs into data visualization requirements, typically via iterative/agile prototyping
· Work closely with data engineers and data scientists to optimally design and implement semantic data consumption within data visualization environments
· Convert data into business insights using advanced visualization techniques to support data-driven decision-making and management reporting
· Interact with customers to interpret business requirements and design documents
· Blend, manipulate, and transform data to create powerful, interactive dashboards
· Handle admin activities such as publishing, adding users, creating subscriptions, and deployment
· Knowledge of databases, warehouses, business intelligence systems, Hadoop, Python, and other data analysis tools is good to have
· Write advanced SQL queries and stored procedures, with a strong understanding of relational and dimensional data models
Must Have:
· Bachelor’s degree (preferably in Computer Science, Mathematics, or a related technical discipline) or equivalent combination of education and experience.
· 5+ years of hands-on experience in designing and developing visualizations in Power BI.
· 3–4 years of experience with self-service BI tools (preferably Power BI).
· Strong knowledge and experience in SQL with solid understanding of relational databases and normalization.
· Proficiency in writing DAX queries in Power BI Desktop.
· Ability to implement row-level security and application security layer models in Power BI.
· Expertise in advanced-level calculations on datasets.
· 5+ years of experience in data preparation, data gateways, and data warehousing projects.
· Technical expertise in data modeling, data mart design, and data extraction from multiple sources.
· Strong knowledge of prototyping, designing, and requirement analysis.
· Excellent analytical, written, and oral communication skills.
· High attention to detail.
Good-to-Have:
· Experience with cloud and big data platforms (AWS, Azure, SQL Server, Data Warehouse, Azure Data Warehousing, Databricks, Big Data).
· Experience with Tableau (developing dashboards, administration, and architecture).
· Broader BI exposure beyond Power BI.
Competencies:
· Tech Savvy - Anticipating and adopting innovations in business-building digital and technology applications.
· Self-Development - Actively seeking new ways to grow and be challenged using both formal and informal development channels.
· Action Oriented - Taking on new opportunities and tough challenges with a sense of urgency, high energy, and enthusiasm.
· Customer Focus - Building strong customer relationships and delivering customer-centric solutions.
· Optimize Work Processes - Knowing the most effective and efficient processes to get things done, with a focus on continuous improvement.
Why Join Us?
- Be part of a collaborative and agile team driving cutting-edge AI and data engineering solutions.
- Work on impactful projects that make a difference across industries.
- Opportunities for professional growth and continuous learning.
- Competitive salary and benefits package.
Application Details
Ready to make an impact? Apply today and become part of the QX Impact team!
Position: .NET Developer
Basic Requirements
- Basic knowledge of C#, .NET Core, SQL Server, and Entity Framework.
- Understanding of OOP concepts, MVC architecture, and database fundamentals.
- Ability to write simple SQL queries & work with stored procedures.
- Familiarity with frontend frameworks like Angular or React JS will be an added advantage.
- Awareness of and a learning mindset toward Clean Architecture and design patterns.
- Strong analytical and logical thinking.
- Good communication skills and willingness to learn from seniors/mentors.
- Passion for coding, problem-solving, and continuously upgrading technical skills.
- Flexible and adaptable, with eagerness to work in both independent and team-based environments.
Responsibilities and Duties
- Assist in writing clean, scalable code using .NET programming languages.
- Support in requirement understanding, coding, testing, and deployment.
- Debug and fix issues under the guidance of senior developers.
- Collaborate with the team in design discussions and code reviews.
- Maintain proper documentation and follow coding standards.
- Learn and implement best practices in software development.
- Actively participate in training, knowledge-sharing, and self-improvement activities.
Benefits
5 days working (Monday to Friday)
Health Insurance
Open and friendly work culture.
Flat hierarchies and a great working environment.
Technically strong projects which enhance your skill and knowledge.
Good Mentoring with a caring approach.
Job Description:
Experience Range: 6 to 10 years
Qualifications:
- Minimum Bachelor's Degree in Engineering, Computer Applications, or AI/Data Science.
- Experience in product companies/startups developing, validating, and productionizing AI models in recent projects within the last 3 years.
- Prior experience with Python, NumPy, scikit-learn, Pandas, ETL/SQL, and BI tools in previous roles preferred.
Required Skills:
- Must Have – Direct hands-on experience working in Python for scripting, automation, analysis, and orchestration
- Must Have – Experience working with ML libraries such as scikit-learn, TensorFlow, PyTorch, Pandas, NumPy, etc.
- Must Have – Experience working with models such as Random Forest, K-means clustering, BERT, etc. (a minimal sketch follows this list)
- Should Have – Exposure to querying warehouses and APIs
- Should Have – Experience writing moderate to complex SQL queries
- Should Have – Experience analyzing and presenting data with BI tools or Excel
- Must Have – Very strong communication skills to work with technical and non-technical stakeholders in a global environment
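For illustration, a minimal sketch of two of the techniques named above, Random Forest classification and K-means clustering, on synthetic data with scikit-learn. The dataset and hyperparameters are illustrative only.

```python
# Random Forest + K-means sketch (pip install scikit-learn numpy).
# Synthetic data stands in for real production features.
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Supervised: Random Forest for classification / predictive modeling.
clf = RandomForestClassifier(n_estimators=100, random_state=42)
clf.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))

# Unsupervised: K-means to group the same records into clusters.
km = KMeans(n_clusters=3, n_init=10, random_state=42)
labels = km.fit_predict(X)
print("cluster sizes:", [int((labels == k).sum()) for k in range(3)])
```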
Roles and Responsibilities:
- Work with Business stakeholders, Business Analysts, Data Analysts to understand various data flows and usage.
- Analyse and present insights about the data and processes to Business Stakeholders
- Validate and test appropriate AI/ML models based on the prioritization and insights developed while working with the Business Stakeholders
- Develop and deploy customized models on Production data sets to generate analytical insights and predictions
- Participate in cross functional team meetings and provide estimates of work as well as progress in assigned tasks.
- Highlight risks and challenges to the relevant stakeholders so that work is delivered in a timely manner.
- Share knowledge and best practices with broader teams to make everyone aware and more productive.
Role: Lead Data Engineer (Core)
Responsibilities:
- Lead end-to-end design, development, and delivery of complex cloud-based data pipelines.
- Collaborate with architects and stakeholders to translate business requirements into technical data solutions.
- Ensure scalability, reliability, and performance of data systems across environments.
- Provide mentorship and technical leadership to data engineering teams.
- Define and enforce best practices for data modeling, transformation, and governance.
- Optimize data ingestion and transformation frameworks for efficiency and cost management.
- Contribute to data architecture design and review sessions across projects.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 8+ years of experience in data engineering with proven leadership in designing cloud-native data systems.
- Strong expertise in Python, SQL, Apache Spark, and at least one cloud platform (Azure, AWS, or GCP); a minimal PySpark sketch appears below, after the preferred qualifications.
- Experience with Big Data, Data Lake, Delta Lake, and Lakehouse architectures.
- Proficiency in one or more database technologies (e.g., PostgreSQL, Redshift, Snowflake, and NoSQL databases).
- Ability to recommend and implement scalable data pipelines.
Preferred Qualifications:
- Cloud certification (AWS, Azure, or GCP).
- Experience with Databricks, Snowflake, or Terraform.
- Familiarity with data governance, lineage, and observability tools.
- Strong collaboration skills and ability to influence data-driven decisions across teams.
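For illustration, a minimal PySpark ingest-transform-load sketch of the kind of cloud data pipeline described above. The file paths, schema, and aggregation are hypothetical, not a real client pipeline.

```python
# Minimal PySpark pipeline sketch (pip install pyspark).
# Paths and columns are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sales-rollup").getOrCreate()

# Ingest: read raw CSV with schema inference (a production pipeline
# would declare an explicit schema instead).
raw = spark.read.csv("/data/raw/sales.csv", header=True, inferSchema=True)

# Transform: clean and aggregate into a daily revenue table.
daily = (
    raw.filter(F.col("amount") > 0)
       .withColumn("order_date", F.to_date("order_ts"))
       .groupBy("order_date", "region")
       .agg(F.sum("amount").alias("revenue"), F.count("*").alias("orders"))
)

# Load: write partitioned Parquet, the typical lake/lakehouse landing format.
daily.write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/daily_sales")
```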
Your job:
• Develop and maintain software components, including APIs and microservices
• Optimize backend systems on Microsoft Azure using App Services, Functions, and Azure SQL
• Contribute to frontend development as needed in a full-stack capacity
• Participate in code reviews, unit testing, and bug fixing to ensure high code quality
• Collaborate with the development team, product owner, and DevOps team in agile projects
• Maintain clear and comprehensive technical documentation for all features and APIs
Your qualification:
• Master’s or bachelor’s degree in computer science
• 5 to 8 years of experience in backend web application development
• Expertise in backend technologies such as C#/.NET Core and in databases, including SQL and NoSQL (Azure SQL, Cosmos DB)
• Experience with Microsoft Azure services (App Services, Functions, SQL) and familiarity with frontend technologies (JavaScript/TypeScript and/ or Angular) would be an added advantage
• Proficiency in cloud-based backend development, full-stack development, and software optimization
• Experience with agile methodologies, unit testing, automated testing, and CI/CD pipelines would be beneficial
• Excellent written and spoken English communication skills
Java Angular Fullstack Developer
Job Description:
Technical Lead – Full Stack
Experience: 8–12 years (strong candidates: Java 50%, Angular 50%)
Location: Remote
PF number is mandatory.
Tech Stack: Java, Spring Boot, Microservices, Angular, SQL
Focus: Hands-on coding, solution design, team leadership, delivery ownership
Must-Have Skills (Depth)
Java (8+): Streams, concurrency, collections, JVM internals (GC), exception handling.
Spring Boot: Security, Actuator, Data/JPA, Feign/RestTemplate, validation, profiles, configuration management.
Microservices: API design, service discovery, resilience patterns (Hystrix/Resilience4j), messaging (Kafka/RabbitMQ) optional.
React: Hooks, component lifecycle, state management, error boundaries, testing (Jest/RTL).
SQL: Joins, aggregations, indexing, query optimization, transaction isolation, schema design.
Testing: JUnit/Mockito for backend; Jest/RTL/Cypress for frontend.
DevOps: Git, CI/CD, containers (Docker), familiarity with deployment environments.
Who we are:
Kanerika Inc. is a premier global software products and services firm that specializes in providing innovative solutions and services for data-driven enterprises. Our focus is to empower businesses to achieve their digital transformation goals and maximize their business impact through the effective use of data and AI. We leverage cutting-edge technologies in data analytics, data governance, AI-ML, GenAI/ LLM and industry best practices to deliver custom solutions that help organizations optimize their operations, enhance customer experiences, and drive growth.
Awards and Recognitions
Kanerika has won several awards over the years, including:
- Best Place to Work 2023 by Great Place to Work®
- Top 10 Most Recommended RPA Start-Ups in 2022 by RPA Today
- NASSCOM Emerge 50 Award in 2014
- Frost & Sullivan India 2021 Technology Innovation Award for its Kompass composable solution architecture
- Kanerika has also been recognized for its commitment to customer privacy and data security, having achieved ISO 27701, SOC2, and GDPR compliances.
Working for us
Kanerika is rated 4.6/5 on Glassdoor, for many good reasons. We truly value our employees' growth, well-being, and diversity, and people’s experiences bear this out. At Kanerika, we offer a host of enticing benefits that create an environment where you can thrive both personally and professionally. From our inclusive hiring practices and mandatory training on creating a safe work environment to our flexible working hours and generous parental leave, we prioritize the well-being and success of our employees. Our commitment to professional development is evident through our mentorship programs, job training initiatives, and support for professional certifications. Additionally, our company-sponsored outings and various time-off benefits ensure a healthy work-life balance. Join us at Kanerika and become part of a vibrant and diverse community where your talents are recognized, your growth is nurtured, and your contributions make a real impact. See the benefits section below for the perks you’ll get while working for Kanerika.
Locations
We are located in Austin (USA), Singapore, Hyderabad, Indore and Ahmedabad (India).
Job Location: Hyderabad, Indore and Ahmedabad.
Role:
We are looking for a highly skilled Full Stack .NET Developer with strong hands-on experience in C#, .NET Core, ASP.NET Core, Web API, and Microservices Architecture. The ideal candidate is proficient in developing scalable, high-performing applications using SQL Server, NoSQL databases, and Entity Framework (v6+), and brings excellent troubleshooting, problem-solving, and communication skills, with the ability to collaborate effectively with cross-functional and international teams, including US counterparts.
Technical Skills
- Programming Languages: C#, TypeScript, JavaScript
- Frameworks & Technologies: .NET Core, ASP.NET Core, Web API, Angular (v10+), Entity Framework (v6+), Microservices Architecture
- Databases: SQL Server, NoSQL
- Cloud Platform: Microsoft Azure
- Design & Architecture: OOPs Concepts, Design Patterns, Reusable Libraries, Microservices Implementation
- Front-End Development: Angular Material, HTML5, CSS3, Responsive UI Development
- Additional Skills: Excellent troubleshooting abilities, strong communication (verbal & written), and effective collaboration with US counterparts
What You’ll Bring:
- Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent work experience.
- 2-5 years of experience.
- Proven experience delivering high-quality web applications.
Mandatory Skills
- Strong hands-on experience with C#, SQL Server, OOP concepts, and Microservices Architecture.
- Solid experience with .NET Core, ASP.NET Core, Web API, SQL, NoSQL, Entity Framework 6 or above, Azure, and applying design patterns.
- Strong proficiency in the Angular framework (v10+ preferred) and TypeScript, and a solid understanding of HTML5, CSS3, and JavaScript.
- Skill in writing reusable libraries and experience with Angular Material or other UI component libraries.
- Excellent communication skills, both oral and written.
- Excellent troubleshooting skills and the ability to communicate clearly with US counterparts.
Preferred Skills (Nice to Have):
- Self-starter with solid analytical and problem-solving skills.
- Willingness to work extra hours to meet deliverables.
- Understanding of Agile/Scrum methodologies.
- Exposure to cloud platforms like AWS/Azure.
Why join us?
- Work with a passionate and innovative team in a fast-paced, growth-oriented environment.
- Gain hands-on experience in content marketing with exposure to real-world projects.
- Opportunity to learn from experienced professionals and enhance your marketing skills.
- Contribute to exciting initiatives and make an impact from day one.
- Competitive stipend and potential for growth within the company.
Employee Benefits
1. Culture:
- Open Door Policy: Encourages open communication and accessibility to management.
- Open Office Floor Plan: Fosters a collaborative and interactive work environment.
- Flexible Working Hours: Allows employees to have flexibility in their work schedules.
- Employee Referral Bonus: Rewards employees for referring qualified candidates.
- Appraisal Process Twice a Year: Provides regular performance evaluations and feedback.
2. Inclusivity and Diversity:
- Hiring practices that promote diversity: Ensures a diverse and inclusive workforce.
- Mandatory POSH training: Promotes a safe and respectful work environment.
3. Health Insurance and Wellness Benefits:
- GMC and Term Insurance: Offers medical coverage and financial protection.
- Health Insurance: Provides coverage for medical expenses.
- Disability Insurance: Offers financial support in case of disability.
4. Child Care & Parental Leave Benefits:
- Company-sponsored family events: Creates opportunities for employees and their families to bond.
- Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child.
- Family Medical Leave: Offers leave for employees to take care of family members' medical needs.
5. Perks and Time-Off Benefits:
- Company-sponsored outings: Organizes recreational activities for employees.
- Gratuity: Provides a monetary benefit as a token of appreciation.
- Provident Fund: Helps employees save for retirement.
- Generous PTO: Offers more than the industry standard for paid time off.
- Paid sick days: Allows employees to take paid time off when they are unwell.
- Paid holidays: Gives employees paid time off for designated holidays.
- Bereavement Leave: Provides time off for employees to grieve the loss of a loved one.
6. Professional Development Benefits:
- L&D with FLEX- Enterprise Learning Repository: Provides access to a learning repository for professional development.
- Mentorship Program: Offers guidance and support from experienced professionals.
- Job Training: Provides training to enhance job-related skills.
- Professional Certification Reimbursements: Assists employees in obtaining professional certifications.
- Promote from Within: Encourages internal growth and advancement opportunities.
Who we are:
Kanerika Inc. is a premier global software products and services firm that specializes in providing innovative solutions and services for data-driven enterprises. Our focus is to empower businesses to achieve their digital transformation goals and maximize their business impact through the effective use of data and AI.
We leverage cutting-edge technologies in data analytics, data governance, AI-ML, GenAI/ LLM and industry best practices to deliver custom solutions that help organizations optimize their operations, enhance customer experiences, and drive growth.
Awards and Recognitions:
Kanerika has won several awards over the years, including:
1. Best Place to Work 2023 by Great Place to Work®
2. Top 10 Most Recommended RPA Start-Ups in 2022 by RPA Today
3. NASSCOM Emerge 50 Award in 2014
4. Frost & Sullivan India 2021 Technology Innovation Award for its Kompass composable solution architecture
5. Kanerika has also been recognized for its commitment to customer privacy and data security, having achieved ISO 27701, SOC2, and GDPR compliances.
Working for us:
Kanerika is rated 4.6/5 on Glassdoor, for many good reasons. We truly value our employees' growth, well-being, and diversity, and people’s experiences bear this out. At Kanerika, we offer a host of enticing benefits that create an environment where you can thrive both personally and professionally. From our inclusive hiring practices and mandatory training on creating a safe work environment to our flexible working hours and generous parental leave, we prioritize the well-being and success of our employees.
Our commitment to professional development is evident through our mentorship programs, job training initiatives, and support for professional certifications. Additionally, our company-sponsored outings and various time-off benefits ensure a healthy work-life balance. Join us at Kanerika and become part of a vibrant and diverse community where your talents are recognized, your growth is nurtured, and your contributions make a real impact. See the benefits section below for the perks you’ll get while working for Kanerika.
About the Role:
We are looking for a highly skilled Full Stack .NET Developer with strong hands-on experience in C#, .NET Core, ASP.NET Core, Web API, and Microservices Architecture. The ideal candidate is proficient in developing scalable, high-performing applications using SQL Server, NoSQL databases, and Entity Framework (v6+), and brings excellent troubleshooting, problem-solving, and communication skills, with the ability to collaborate effectively with cross-functional and international teams, including US counterparts.
Technical Skills:
- Programming Languages: C#, TypeScript, JavaScript
- Frameworks & Technologies: .NET Core, ASP.NET Core, Web API, Angular (v10+), Entity Framework (v6+), Microservices Architecture
- Databases: SQL Server, NoSQL
- Cloud Platform: Microsoft Azure
- Design & Architecture: OOPs Concepts, Design Patterns, Reusable Libraries, Microservices Implementation
- Front-End Development: Angular Material, HTML5, CSS3, Responsive UI Development
- Additional Skills: Excellent troubleshooting abilities, strong communication (verbal & written), and effective collaboration with US counterparts
What You’ll Bring:
- Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent work experience.
- 6+ years of experience
- Proven experience delivering high-quality web applications.
Mandatory Skills:
- Strong hands-on experience with C#, SQL Server, OOP concepts, and Microservices Architecture.
- Solid experience with .NET Core, ASP.NET Core, Web API, SQL, NoSQL, Entity Framework 6 or above, Azure, and applying design patterns.
- Strong proficiency in the Angular framework (v10+ preferred) and TypeScript, and a solid understanding of HTML5, CSS3, and JavaScript.
- Skill in writing reusable libraries and experience with Angular Material or other UI component libraries.
- Excellent communication skills, both oral and written.
- Excellent troubleshooting skills and the ability to communicate clearly with US counterparts.
Preferred Skills (Nice to Have):
- Self-starter with solid analytical and problem-solving skills. Willingness to work extra hours to meet deliverables.
- Understanding of Agile/Scrum Methodologies.
- Exposure to cloud platforms like AWS/Azure.
Employee Benefits:
1. Culture:
- Open Door Policy: Encourages open communication and accessibility to management.
- Open Office Floor Plan: Fosters a collaborative and interactive work environment.
- Flexible Working Hours: Allows employees to have flexibility in their work schedules.
- Employee Referral Bonus: Rewards employees for referring qualified candidates.
- Appraisal Process Twice a Year: Provides regular performance evaluations and feedback.
2. Inclusivity and Diversity:
- Hiring practices that promote diversity: Ensures a diverse and inclusive workforce.
- Mandatory POSH training: Promotes a safe and respectful work environment.
3. Health Insurance and Wellness Benefits:
- GMC and Term Insurance: Offers medical coverage and financial protection.
- Health Insurance: Provides coverage for medical expenses.
- Disability Insurance: Offers financial support in case of disability.
4. Child Care & Parental Leave Benefits:
- Company-sponsored family events: Creates opportunities for employees and their families to bond.
- Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child.
- Family Medical Leave: Offers leave for employees to take care of family members' medical needs.
5. Perks and Time-Off Benefits:
- Company-sponsored outings: Organizes recreational activities for employees.
- Gratuity: Provides a monetary benefit as a token of appreciation.
- Provident Fund: Helps employees save for retirement.
- Generous PTO: Offers more than the industry standard for paid time off.
- Paid sick days: Allows employees to take paid time off when they are unwell.
- Paid holidays: Gives employees paid time off for designated holidays.
- Bereavement Leave: Provides time off for employees to grieve the loss of a loved one.
6. Professional Development Benefits:
- L&D with FLEX- Enterprise Learning Repository: Provides access to a learning repository for professional development.
- Mentorship Program: Offers guidance and support from experienced professionals.
- Job Training: Provides training to enhance job-related skills.
- Professional Certification Reimbursements: Assists employees in obtaining professional certifications.
- Promote from Within: Encourages internal growth and advancement opportunities.
Job Title: QA Tester – FinTech (Manual + Automation Testing)
Location: Bangalore, India
Job Type: Full-Time
Experience Required: 3 Years
Industry: FinTech / Financial Services
Function: Quality Assurance / Software Testing
About the Role:
We are looking for a skilled QA Tester with 3 years of experience in both manual and automation testing, ideally in the FinTech domain. The candidate will work closely with development and product teams to ensure that our financial applications meet the highest standards of quality, performance, and security.
Key Responsibilities:
- Analyze business and functional requirements for financial products and translate them into test scenarios.
- Design, write, and execute manual test cases for new features, enhancements, and bug fixes.
- Develop and maintain automated test scripts using tools such as Selenium, TestNG, or similar frameworks.
- Conduct API testing using Postman, Rest Assured, or similar tools (see the sketch after this list).
- Perform functional, regression, integration, and system testing across web and mobile platforms.
- Work in an Agile/Scrum environment and actively participate in sprint planning, stand-ups, and retrospectives.
- Log and track defects using JIRA or a similar defect management tool.
- Collaborate with developers, BAs, and DevOps teams to improve quality across the SDLC.
- Ensure test coverage for critical fintech workflows like transactions, KYC, lending, payments, and compliance.
- Assist in setting up CI/CD pipelines for automated test execution using tools like Jenkins, GitLab CI, etc.
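To make the API-testing expectation above concrete, here is a minimal sketch in Python using the requests library (the base URL and field names are hypothetical, not a real contract):

    # Minimal API regression check; endpoint and fields are hypothetical.
    import requests

    BASE_URL = "https://api.example-fintech.com"  # hypothetical

    def check_transaction_schema(txn_id: str) -> None:
        resp = requests.get(f"{BASE_URL}/v1/transactions/{txn_id}", timeout=10)
        assert resp.status_code == 200, resp.text
        body = resp.json()
        for field in ("id", "amount", "currency", "status"):
            assert field in body, f"missing field: {field}"

    if __name__ == "__main__":
        check_transaction_schema("TXN-12345")
        print("schema check passed")

In practice the same checks would live in a pytest suite wired into the CI/CD pipeline mentioned above.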
Required Skills and Experience:
- 3+ years of hands-on experience in manual and automation testing.
- Solid understanding of QA methodologies, STLC, and SDLC.
- Experience in testing FinTech applications such as digital wallets, online banking, investment platforms, etc.
- Strong experience with Selenium WebDriver, TestNG, Postman, and JIRA.
- Knowledge of API testing, including RESTful services.
- Familiarity with SQL to validate data in databases.
- Understanding of CI/CD processes and basic scripting for automation integration.
- Good problem-solving skills and attention to detail.
- Excellent communication and documentation skills.
Preferred Qualifications:
- Exposure to financial compliance and regulatory testing (e.g., PCI DSS, AML/KYC).
- Experience with mobile app testing (iOS/Android).
- Working knowledge of test management tools like TestRail, Zephyr, or Xray.
- Performance testing experience (e.g., JMeter, LoadRunner) is a plus.
- Basic knowledge of version control systems (e.g., Git).
Job Details
- Job Title: Lead I - Data Engineering
- Industry: Global digital transformation solutions provider
- Domain: Information Technology (IT)
- Experience Required: 6-9 years
- Employment Type: Full Time
- Job Location: Pune
- CTC Range: Best in Industry
Job Description
Job Title: Senior Data Engineer (Kafka & AWS)
Responsibilities:
- Develop and maintain real-time data pipelines using Apache Kafka (MSK or Confluent) and AWS services (a consumer sketch follows this list).
- Configure and manage Kafka connectors, ensuring seamless data flow and integration across systems.
- Demonstrate strong expertise in the Kafka ecosystem, including producers, consumers, brokers, topics, and schema registry.
- Design and implement scalable ETL/ELT workflows to efficiently process large volumes of data.
- Optimize data lake and data warehouse solutions using AWS services such as Lambda, S3, and Glue.
- Implement robust monitoring, testing, and observability practices to ensure reliability and performance of data platforms.
- Uphold data security, governance, and compliance standards across all data operations.
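As one illustration of the Kafka work described above, a minimal consumer sketch using the kafka-python client (broker address, topic, and group id are hypothetical):

    # Minimal Kafka consumer; broker, topic, and group id are hypothetical.
    import json
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "orders",                                  # hypothetical topic
        bootstrap_servers=["localhost:9092"],
        group_id="etl-workers",
        auto_offset_reset="earliest",
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    )

    for message in consumer:
        record = message.value
        # A real pipeline would transform and load here (e.g., to S3 or Glue).
        print(message.topic, message.partition, message.offset, record)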
Requirements:
- Minimum of 5 years of experience in Data Engineering or related roles.
- Proven expertise with Apache Kafka and the AWS data stack (MSK, Glue, Lambda, S3, etc.).
- Proficient in coding with Python, SQL, and Java — with Java strongly preferred.
- Experience with Infrastructure-as-Code (IaC) tools (e.g., CloudFormation) and CI/CD pipelines.
- Excellent problem-solving, communication, and collaboration skills.
- Flexibility to write production-quality code in both Python and Java as required.
Skills: AWS, Kafka, Python
Notice period: 0 to 15 days only
ROLES AND RESPONSIBILITIES:
You will be responsible for architecting, implementing, and optimizing Dremio-based data lakehouse environments integrated with cloud storage, BI, and data engineering ecosystems. The role requires a strong balance of architecture design, data modeling, query optimization, and governance enablement in large-scale analytical environments.
- Design and implement Dremio lakehouse architecture on cloud (AWS/Azure/Snowflake/Databricks ecosystem).
- Define data ingestion, curation, and semantic modeling strategies to support analytics and AI workloads.
- Optimize Dremio reflections, caching, and query performance for diverse data consumption patterns.
- Collaborate with data engineering teams to integrate data sources via APIs, JDBC, Delta/Parquet, and object storage layers (S3/ADLS); a Parquet-reading sketch follows this list.
- Establish best practices for data security, lineage, and access control aligned with enterprise governance policies.
- Support self-service analytics by enabling governed data products and semantic layers.
- Develop reusable design patterns, documentation, and standards for Dremio deployment, monitoring, and scaling.
- Work closely with BI and data science teams to ensure fast, reliable, and well-modeled access to enterprise data.
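For flavor, a minimal sketch of reading a curated Parquet file with pyarrow, the kind of object-storage integration listed above (file name and columns are hypothetical):

    # Read selected columns from a Parquet file with pyarrow.
    import pyarrow.parquet as pq

    table = pq.read_table("events.parquet", columns=["user_id", "event_ts", "amount"])
    print(table.num_rows, table.schema)
    df = table.to_pandas()  # hand off to BI or analysis tooling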
IDEAL CANDIDATE:
- Bachelor’s or Master’s in Computer Science, Information Systems, or related field.
- 5+ years in data architecture and engineering, with 3+ years in Dremio or modern lakehouse platforms.
- Strong expertise in SQL optimization, data modeling, and performance tuning within Dremio or similar query engines (Presto, Trino, Athena).
- Hands-on experience with cloud storage (S3, ADLS, GCS), Parquet/Delta/Iceberg formats, and distributed query planning.
- Knowledge of data integration tools and pipelines (Airflow, dbt, Kafka, Spark, etc.).
- Familiarity with enterprise data governance, metadata management, and role-based access control (RBAC).
- Excellent problem-solving, documentation, and stakeholder communication skills.
PREFERRED:
- Experience integrating Dremio with BI tools (Tableau, Power BI, Looker) and data catalogs (Collibra, Alation, Purview).
- Exposure to Snowflake, Databricks, or BigQuery environments.
- Experience in high-tech, manufacturing, or enterprise data modernization programs.
An L2 Technical Support Engineer with Python knowledge is responsible for handling escalated, more complex technical issues that the Level 1 (L1) support team cannot resolve. Your primary goal is to perform deep-dive analysis, troubleshooting, and problem resolution to minimize customer downtime and ensure system stability.
Python is a key skill, used for scripting, automation, debugging, and data analysis in this role.
Key Responsibilities
- Advanced Troubleshooting & Incident Management:
- Serve as the escalation point for complex technical issues (often involving software bugs, system integrations, backend services, and APIs) that L1 support cannot resolve.
- Diagnose, analyze, and resolve problems, often requiring in-depth log analysis, code review, and database querying.
- Own the technical resolution of incidents end-to-end, adhering strictly to established Service Level Agreements (SLAs).
- Participate in on-call rotation for critical (P1) incident support outside of regular business hours.
- Python-Specific Tasks:
- Develop and maintain Python scripts for automation of repetitive support tasks, system health checks, and data manipulation (see the sketch after this responsibilities list).
- Use Python for debugging and troubleshooting by analyzing application code, API responses, or data pipeline issues.
- Write ad-hoc scripts to extract, analyze, or modify data in databases for diagnostic or resolution purposes.
- Potentially apply basic-to-intermediate code fixes in Python applications in collaboration with development teams.
- Collaboration and Escalation:
- Collaborate closely with L3 Support, Software Engineers, DevOps, and Product Teams to report bugs, propose permanent fixes, and provide comprehensive investigation details.
- Escalate issues that require significant product changes or deeper engineering expertise to the L3 team, providing clear, detailed documentation of all steps taken.
- Documentation and Process Improvement:
- Conduct Root Cause Analysis (RCA) for major incidents, documenting the cause, resolution, and preventative actions.
- Create and maintain a Knowledge Base (KB), runbooks, and Standard Operating Procedures (SOPs) for recurring issues to empower L1 and enable customer self-service.
- Proactively identify technical deficiencies in processes and systems and recommend improvements to enhance service quality.
- Customer Communication:
- Maintain professional, clear, and timely communication with customers, explaining complex technical issues and resolutions in an understandable manner.
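As a small example of the Python automation described above, a hypothetical health check that counts ERROR lines per logger in a service log (the log path and line format are assumptions):

    # Hypothetical health check: count ERROR lines per logger and flag spikes.
    from collections import Counter
    from pathlib import Path

    LOG_PATH = Path("/var/log/app/service.log")  # hypothetical path
    THRESHOLD = 50

    def error_summary(path: Path) -> Counter:
        counts = Counter()
        for line in path.read_text(errors="replace").splitlines():
            if " ERROR " in line:
                parts = line.split()
                # Assumes the logger name is the fourth whitespace field.
                counts[parts[3] if len(parts) > 3 else "unknown"] += 1
        return counts

    if __name__ == "__main__":
        for logger, n in error_summary(LOG_PATH).most_common():
            print("ALERT" if n > THRESHOLD else "ok", logger, n, sep="\t")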
Required Technical Skills
- Programming/Scripting:
- Strong proficiency in Python (for scripting, automation, debugging, and data manipulation).
- Experience with other scripting languages like Bash or Shell
- Databases:
- Proficiency in SQL for complex querying, debugging data flow issues, and data extraction.
- Application/Web Technologies:
- Understanding of API concepts (RESTful/SOAP) and experience troubleshooting them using tools like Postman or curl.
- Knowledge of application architectures (e.g., microservices, SOA) is a plus.
- Monitoring & Tools:
- Experience with support ticketing systems (e.g., JIRA, ServiceNow).
- Familiarity with log aggregation and monitoring tools (Kibana, Splunk, ELK Stack, Grafana)
Role: Full-Time, Long-Term
Required: Python, SQL
Preferred: Experience with financial or crypto data
OVERVIEW
We are seeking a data engineer to join as a core member of our technical team. This is a long-term position for someone who wants to build robust, production-grade data infrastructure and grow with a small, focused team. You will own the data layer that feeds our machine learning pipeline—from ingestion and validation through transformation, storage, and delivery.
The ideal candidate is meticulous about data quality, thinks deeply about failure modes, and builds systems that run reliably without constant attention. You understand that downstream ML models are only as good as the data they consume.
CORE TECHNICAL REQUIREMENTS
Python (Required): Professional-level proficiency. You write clean, maintainable code for data pipelines—not throwaway scripts. Comfortable with Pandas, NumPy, and their performance characteristics. You know when to use Python versus push computation to the database.
SQL (Required): Advanced SQL skills. Complex queries, query optimization, schema design, execution plans. PostgreSQL experience strongly preferred. You think about indexing, partitioning, and query performance as second nature.
Data Pipeline Design (Required): You build pipelines that handle real-world messiness gracefully. You understand idempotency, exactly-once semantics, backfill strategies, and incremental versus full recomputation tradeoffs. You design for failure—what happens when an upstream source is late, returns malformed data, or goes down entirely. Experience with workflow orchestration required: Airflow, Prefect, Dagster, or similar.
Data Quality (Required): You treat data quality as a first-class concern. You implement validation checks, anomaly detection, and monitoring. You know the difference between data that is missing versus data that should not exist. You build systems that catch problems before they propagate downstream.
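To make the idempotency point concrete, a minimal sketch assuming PostgreSQL and psycopg2 (table, key, and DSN are hypothetical): re-running the same batch converges to the same state instead of duplicating rows.

    # Idempotent load: duplicate (symbol, ts) rows overwrite rather than duplicate.
    import psycopg2

    UPSERT = """
        INSERT INTO prices (symbol, ts, close)
        VALUES (%s, %s, %s)
        ON CONFLICT (symbol, ts) DO UPDATE SET close = EXCLUDED.close;
    """

    def load_batch(rows):
        # rows: iterable of (symbol, ts, close) tuples
        with psycopg2.connect("dbname=market") as conn:  # hypothetical DSN
            with conn.cursor() as cur:
                cur.executemany(UPSERT, rows)
        # Safe to retry or backfill the same window without corrupting state.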
WHAT YOU WILL BUILD
Data Ingestion: Pipelines pulling from diverse sources—crypto exchanges, traditional market feeds, on-chain data, alternative data. Handling rate limits, API quirks, authentication, and source-specific idiosyncrasies.
Data Validation: Checks ensuring completeness, consistency, and correctness. Schema validation, range checks, freshness monitoring, cross-source reconciliation.
Transformation Layer: Converting raw data into clean, analysis-ready formats. Time series alignment, handling different frequencies and timezones, managing gaps.
Storage and Access: Schema design optimized for both write patterns (ingestion) and read patterns (ML training, feature computation). Data lifecycle and retention management.
Monitoring and Alerting: Observability into pipeline health. Knowing when something breaks before it affects downstream systems.
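A matching freshness-check sketch, again assuming PostgreSQL/psycopg2 with a timestamptz column (table name and SLA are hypothetical):

    # Alert if the newest row is older than the freshness SLA.
    from datetime import datetime, timedelta, timezone
    import psycopg2

    SLA = timedelta(minutes=15)  # hypothetical freshness budget

    def prices_are_fresh() -> bool:
        with psycopg2.connect("dbname=market") as conn:
            with conn.cursor() as cur:
                cur.execute("SELECT max(ts) FROM prices;")
                latest = cur.fetchone()[0]  # assumes ts is timestamptz
        fresh = latest is not None and datetime.now(timezone.utc) - latest <= SLA
        if not fresh:
            print("ALERT: prices table is stale")  # hook into real alerting here
        return fresh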
DOMAIN EXPERIENCE
Preference for candidates with experience in financial or crypto data—understanding market data conventions, exchange-specific quirks, and point-in-time correctness. You know why look-ahead bias is dangerous and how to prevent it.
Time series data at scale—hundreds of symbols with years of history, multiple frequencies, derived features. You understand temporal joins, windowed computations, and time-aligned data challenges.
High-dimensional feature stores—we work with hundreds of thousands of derived features. Experience managing, versioning, and serving large feature sets is valuable.
ENGINEERING STANDARDS
Reliability: Pipelines run unattended. Failures are graceful with clear errors, not silent corruption. Recovery is straightforward.
Reproducibility: Same inputs and code version produce identical outputs. You version schemas, track lineage, and can reconstruct historical states.
Documentation: Schemas, data dictionaries, pipeline dependencies, operational runbooks. Others can understand and maintain your systems.
Testing: You write tests for pipelines—validation logic, transformation correctness, edge cases. Untested pipelines are broken pipelines waiting to happen.
TECHNICAL ENVIRONMENT
PostgreSQL, Python, workflow orchestration (flexible on tool), cloud infrastructure (GCP preferred but flexible), Git.
WHAT WE ARE LOOKING FOR
Attention to Detail: You notice when something is slightly off and investigate rather than ignore.
Defensive Thinking: You assume sources will send bad data, APIs will fail, schemas will change. You build accordingly.
Self-Direction: You identify problems, propose solutions, and execute without waiting to be told.
Long-Term Orientation: You build systems you will maintain for years.
Communication: You document clearly, explain data issues to non-engineers, and surface problems early.
EDUCATION
University degree in a quantitative/technical field preferred: Computer Science, Mathematics, Statistics, Engineering. Equivalent demonstrated expertise also considered.
TO APPLY
Include: (1) CV/resume, (2) Brief description of a data pipeline you built and maintained, (3) Links to relevant work if available, (4) Availability and timezone.
SimplyFI is a fast-growing AI- and blockchain-powered product company transforming trade finance and banking through digital innovation. We build scalable, intelligent platforms that simplify complex financial workflows for enterprises and financial institutions.
We are looking for a Full Stack Tech Lead with strong expertise in ReactJS (primary) and solid working knowledge of Python (secondary) to join our team in Thane, Mumbai.
Role: Full Stack Tech Lead (ReactJS + Python)
Key Responsibilities:
- Design, develop, and maintain scalable full-stack applications, with ReactJS as the primary frontend technology
- Build and integrate backend services using Python (Flask / Django / FastAPI)
- Design and manage RESTful APIs for internal and external system integrations
- Collaborate on AI-driven product features and support machine-learning model integrations when required
- Work closely with DevOps teams to deploy, monitor, and optimize applications on AWS
- Ensure performance, scalability, security, and code quality across the application stack
- Collaborate with product managers, designers, and QA teams to deliver high-quality features
- Write clean, maintainable, and testable code following engineering best practices
- Participate in agile processes, including code reviews, sprint planning, and daily stand-ups
Required Skills & Qualifications:
- Strong hands-on experience with ReactJS, including hooks, state management, Redux, and API integrations
- Proficiency in backend development using Python (Flask, Django, or FastAPI)
- Solid understanding of RESTful API design and secure authentication mechanisms (OAuth2, JWT); see the sketch after this list
- Experience working with databases such as MySQL, PostgreSQL, and MongoDB
- Familiarity with microservices architecture and modern software design patterns
- Hands-on experience with Git, CI/CD pipelines, Docker, and Kubernetes
- Strong problem-solving, debugging, and performance optimization skills
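To illustrate the JWT piece of the stack above, a minimal issue/verify sketch with PyJWT (the secret and claims are hypothetical; a real deployment would pull keys from a secrets manager):

    # Issue and verify a short-lived JWT with PyJWT.
    import datetime
    import jwt  # PyJWT

    SECRET = "change-me"  # hypothetical; never hard-code in production

    def issue_token(user_id: str) -> str:
        payload = {
            "sub": user_id,
            "exp": datetime.datetime.now(datetime.timezone.utc)
                   + datetime.timedelta(hours=1),
        }
        return jwt.encode(payload, SECRET, algorithm="HS256")

    def verify_token(token: str) -> dict:
        # Raises jwt.ExpiredSignatureError / jwt.InvalidTokenError on bad input.
        return jwt.decode(token, SECRET, algorithms=["HS256"])

    print(verify_token(issue_token("user-42"))["sub"])  # -> user-42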
Job Responsibilities :
- Work closely with product managers and other cross-functional teams to help define, scope, and deliver world-class products and high-quality features addressing key user needs.
- Translate requirements into system architecture and implement code while considering performance issues of dealing with billions of rows of data and serving millions of API requests every hour.
- Ability to take full ownership of the software development lifecycle from requirement to release.
- Writing and maintaining clear technical documentation enabling other engineers to step in and deliver efficiently.
- Embrace design and code reviews to deliver quality code.
- Play a key role in taking Trendlyne to the next level as a world-class engineering team.
- Develop and iterate on best practices for the development team, ensuring adherence through code reviews.
- As part of the core team, you will be working on cutting-edge technologies like AI products, online backtesting, data visualization, and machine learning.
- Develop and maintain scalable, robust backend systems using Python and Django framework.
- Strong understanding of web and mobile application performance.
- Mentor junior developers and foster skill development within the team.
Job Requirements :
- 1+ years of experience with Python and Django.
- Strong understanding of relational databases like PostgreSQL or MySQL and Redis.
- (Optional): Experience with web front-end technologies such as JavaScript, HTML, and CSS
Who are we :
Trendlyne is a Series-A product startup in the financial markets space, with cutting-edge analytics products aimed at businesses in stock markets and mutual funds.
Our founders are IIT + IIM graduates, with strong tech, analytics, and marketing experience. We have top finance and management experts on the Board of Directors.
What do we do :
We build powerful analytics products in the stock market space that are best in class. Organic growth in B2B and B2C products has already made the company profitable. We deliver 900 million+ API calls every month to B2B customers. Trendlyne analytics deals with hundreds of millions of rows of data to generate insights, scores, and visualizations that are an industry benchmark.
Job Responsibilities:
- Develop features across multiple sub-modules within our applications, including collaboration in requirements definition, prototyping, design, coding, testing, debugging, effort estimation, and continuous quality improvement of the design & code and deployment.
- Design and implement new features, provide fixes/workarounds to bugs, and innovate in alternate solutions.
- Provide quick solutions to problems and take a feature/component through the entire life cycle, improving space–time performance and usability/reliability.
- Design, implement, and adhere to the overall architecture to fulfill the functional requirements through software components.
- Take accountability for the successful delivery of functionality or modules contributing to the overall product objective.
- Create consistent design specifications using flowcharts, class diagrams, Entity Relationship Diagrams (ERDs), and other visual techniques to convey the development approach to the lead developer and other stakeholders.
- Conduct source code walkthroughs, refactoring, and ensure adherence to documentation standards.
- Support troubleshooting efforts in production systems and fulfill support requests from developers.
Experience and Skills:
- Bachelor’s degree in Computer Science or similar technical discipline required; Master’s degree preferred.
- Strong experience as a software engineer with demonstrated success developing a variety of software systems and increasing responsibility in analysis, design, implementation, and deployment tasks with a reputed software product company.
- Hands-on experience in product development using Java 8, J2EE, Spring Boot, Spring MVC, JSF, REST API, JSON, SQL Server, PostgreSQL, Oracle, Redis Cache, Amber, JavaScript/jQuery.
- Good to have experience in Handlebars.js, Flyway, PrimeFaces.
- Experience developing data-driven applications utilizing major relational database engines (SQL Server, Oracle, DB2) including writing complex queries, stored procedures, and performing query optimization.
- Experience building web-based software systems with N-tier architectures, dynamic content, scalable solutions, and complex security implementations.
- Strong understanding of Design Patterns, system architecture, and configurations for enterprise web applications.
- Exposure to development environments such as Eclipse, GitHub/Bitbucket.
- Comfortable with source code management concepts (version control).
- Self-motivated, energetic, fast learner with excellent communication skills (interaction with remote teams required).
- Experience with Agile software development is a plus.
Travel: Based on business needs.
Location: Gurgaon
Job Title: Jira, Confluence, and Bitbucket Administrator
Location: Mumbai, India (candidate must be willing to attend onsite interviews in Malad, Mumbai)
Job Type: Full-time
Experience: 3–4 years (immediate joiners or candidates available within 15–30 days preferred; local profiles preferred)
Key Responsibilities:
· Administer, configure, and maintain Jira, Confluence, and Bitbucket environments, ensuring optimal performance and reliability.
· Work closely with cross-functional teams to gather requirements and deliver effective solutions using Atlassian tools
· Implement and manage user access controls, roles, and permissions within Jira, Confluence, and Bitbucket.
· Collaborate with development and project teams to gather requirements and provide solutions using Jira workflows and Confluence documentation.
· Create and maintain custom scripts using Groovy for automation, improvements, and enhancements across the Atlassian suite.
· Develop and implement project management features, dashboards, and reports to support various stakeholders.
· Troubleshoot and resolve issues related to Jira, Confluence, and Bitbucket, providing timely support to users.
· Conduct training sessions and workshops to inform users of best practices and new features in the Atlassian tools.
· Stay up-to-date with new releases and features from Atlassian and evaluate their applicability to our processes.
Qualifications:
· Bachelor's degree in Computer Science, Information Technology, or a related field.
· 3-4 years of experience in administering Jira, Confluence, and Bitbucket in a corporate environment.
· Proficiency in Groovy scripting for customizing and automating Atlassian products.
· Strong analytical and problem-solving skills.
· Excellent communication and collaboration abilities.
· Familiarity with Agile methodologies and project management principles.
· Experience with other development tools and practices is a plus.
We are seeking a motivated Data Analyst to support business operations by analyzing data, preparing reports, and delivering meaningful insights. The ideal candidate should be comfortable working with data, identifying patterns, and presenting findings in a clear and actionable way.
Key Responsibilities:
- Collect, clean, and organize data from internal and external sources (see the sketch after this list)
- Analyze large datasets to identify trends, patterns, and opportunities
- Prepare regular and ad-hoc reports for business stakeholders
- Create dashboards and visualizations using tools like Power BI or Tableau
- Work closely with cross-functional teams to understand data requirements
- Ensure data accuracy, consistency, and quality across reports
- Document data processes and analysis methods
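For a flavor of the day-to-day work, a minimal cleaning sketch in pandas (file and column names are hypothetical):

    # Clean a raw extract and summarize it by region.
    import pandas as pd

    df = pd.read_csv("sales_raw.csv")                    # hypothetical file
    df = df.drop_duplicates()
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
    df = df.dropna(subset=["order_id", "order_date"])    # drop unusable rows
    df["region"] = df["region"].str.strip().str.title()  # normalize labels
    print(df.groupby("region")["amount"].agg(["count", "sum"]))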
We’re looking for a Program Manager-1 to join our Growth team: someone who thrives in fast-paced environments and can turn user insights into measurable impact. You’ll work across product and business functions to drive growth, optimize funnels, and enhance the user journey.
What You’ll Do
- Own parts of the user journey and drive improvements across acquisition, activation, and retention funnels.
- Partner with Product, Marketing, Engineering, and Design teams to identify growth opportunities and execute data-backed experiments.
- Use data and user insights to pinpoint drop-offs and design solutions that improve conversion.
- Build, track, and measure growth metrics and KPIs.
- Bring structure and clarity to ambiguous problems and drive alignment across teams.
- Stay on top of product trends and best practices to inspire new growth ideas.
What We’re Looking For
- Graduate from a Tier 1 institute (IITs, IIMs, ISB, BITS, etc.)
- 2–2.5 years of experience, preferably in a B2C startup (not early-stage).
- Exposure to digital products or services is a plus.
- Experience working closely with product and business teams.
- Strong analytical skills and structured thinking
What You’ll Do:
We are looking for a Staff Software Engineer based in Pune, India who can master both DeepIntent’s data architectures and pharma research and analytics methodologies to make significant contributions to how health media is analyzed by our clients. This role requires an Engineer who not only understands DBA functions but also how they impact research objectives and can work with researchers and data scientists to achieve impactful results.
This role will be in the Analytics Organization and will require integration and partnership with the Engineering Organization. The ideal candidate is an inquisitive self-starter who is not afraid to take on and learn from challenges, and who will constantly seek to improve the facets of the business they manage. The ideal candidate will also need to demonstrate the ability to collaborate and partner with others.
- Serve as the Engineering interface between Analytics and Engineering teams.
- Develop and standardize all interface points for analysts to retrieve and analyze data with a focus on research methodologies and data-based decision-making.
- Optimize queries and data access efficiencies, serve as an expert in how to most efficiently attain desired data points.
- Build “mastered” versions of the data for Analytics-specific querying use cases.
- Help with data ETL, table performance optimization.
- Establish a formal data practice for the Analytics practice in conjunction with the rest of DeepIntent
- Build & operate scalable and robust data architectures.
- Interpret analytics methodology requirements and apply them to data architecture to create standardized queries and operations for use by analytics teams.
- Implement DataOps practices.
- Master existing and new Data Pipelines and develop appropriate queries to meet analytics-specific objectives.
- Collaborate with various business stakeholders, software engineers, machine learning engineers, and analysts.
- Operate between Engineers and Analysts to unify both practices for analytics insight creation.
Who You Are:
- 8+ years of experience in tech support, specializing in monitoring and maintaining data pipelines.
- Adept in market research methodologies and using data to deliver representative insights.
- Inquisitive, curious, understands how to query complicated data sets, move and combine data between databases.
- Deep SQL experience is a must.
- Exceptional communication skills with the ability to collaborate and translate between technical and non-technical needs.
- English Language Fluency and proven success working with teams in the U.S.
- Experience in designing, developing and operating configurable Data pipelines serving high-volume and velocity data.
- Experience working with public clouds like GCP/AWS.
- Good understanding of software engineering, DataOps, and data architecture, Agile and DevOps methodologies.
- Experience building Data architectures that optimize performance and cost, whether the components are prepackaged or homegrown.
- Proficient with SQL, Python or JVM-based language, Bash.
- Experience with Apache open-source projects such as Spark, Druid, Beam, Airflow, etc., and big data databases like BigQuery, ClickHouse, etc.
- Ability to think big, take bets and innovate, dive deep, hire and develop the best talent, learn and be curious.
What you'll be doing:
As a Software Developer at Trential, you will be the bridge between technical strategy and hands-on execution. You will work with our dedicated engineering team designing, building, and deploying our core platforms and APIs, and you will also build and maintain back-end interfaces using modern frameworks. You will ensure our solutions are scalable, secure, interoperable, and aligned with open standards and our core vision.
- Design & Implement: Lead the design, implementation and management of Trential’s products.
- Code Quality & Best Practices: Enforce high standards for code quality, security, and performance through rigorous code reviews, automated testing, and continuous delivery pipelines.
- Standards Adherence: Ensure all solutions comply with relevant open standards like W3C Verifiable Credentials (VCs), Decentralized Identifiers (DIDs) & Privacy Laws, maintaining global interoperability.
- Continuous Improvement: Lead the charge to continuously evaluate and improve the products & processes. Instill a culture of metrics-driven process improvement to boost team efficiency and product quality.
- Cross-Functional Collaboration: Work closely with the Co-Founders & Product Team to translate business requirements and market needs into clear, actionable technical specifications and stories. Represent Trential in interactions with external stakeholders for integrations.
What we're looking for:
- 3+ years of experience in backend development.
- Deep proficiency in JavaScript and Node.js, with experience building and operating distributed, fault-tolerant systems.
- Hands-on experience with cloud platforms (AWS & GCP) and modern DevOps practices (e.g., CI/CD, Infrastructure as Code, Docker).
- Strong knowledge of SQL/NoSQL databases and data modeling for high-throughput, secure applications.
Preferred Qualifications (Nice to Have)
- Knowledge of decentralized identity principles, Verifiable Credentials (W3C VCs), DIDs, and relevant protocols (e.g., OpenID4VC, DIDComm)
- Familiarity with data privacy and security standards (GDPR, SOC 2, ISO 27001) and experience designing systems that comply with them.
- Experience integrating AI/ML models into verification or data extraction workflows.
Job Description
Wissen Technology is seeking an experienced C# .NET Developer to build and maintain applications related to streaming market data. This role involves developing message-based C#/.NET applications to process, normalize, and summarize large volumes of market data efficiently. The candidate should have a strong foundation in Microsoft .NET technologies and experience working with message-driven, event-based architecture. Knowledge of capital markets and equity market data is highly desirable.
Responsibilities
- Design, develop, and maintain message-based C#/.NET applications for processing real-time and batch market data feeds.
- Build robust routines to download and process data from AWS S3 buckets on a frequent schedule (a sketch of the pattern follows this list).
- Implement daily data summarization and data normalization routines.
- Collaborate with business analysts, data providers, and other developers to deliver high-quality, scalable market data solutions.
- Troubleshoot and optimize market data pipelines to ensure low latency and high reliability.
- Contribute to documentation, code reviews, and team knowledge sharing.
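The S3 routine above would look roughly like this; sketched in Python with boto3 for brevity (the posting's stack is C#/.NET, where the AWS SDK exposes the same ListObjectsV2/GetObject operations; bucket and prefix are hypothetical):

    # Poll an S3 prefix and download new objects (first page only; use
    # paginators and a seen-keys store in production).
    import boto3

    s3 = boto3.client("s3")

    def fetch_feed_files(bucket="market-data-feeds", prefix="eod/"):
        resp = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
        for obj in resp.get("Contents", []):
            key = obj["Key"]
            s3.download_file(bucket, key, key.split("/")[-1])
            # Normalization/summarization of the file would follow here.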
Required Skills and Experience
- 5+ years of professional experience programming in C# and Microsoft .NET framework.
- Strong understanding of message-based and real-time programming architectures.
- Experience working with AWS services, specifically S3, for data retrieval and processing.
- Experience with SQL and Microsoft SQL Server.
- Familiarity with Equity market data, FX, Futures & Options, and capital markets concepts.
- Excellent interpersonal and communication skills.
- Highly motivated, curious, and analytical mindset with the ability to work well both independently and in a team environment.
Job Description
Role: Data Analyst
Experience: 6 - 9 Years
Location: Hyderabad
WorkMode: Work from Office (5 Days)
Overview
We are seeking a highly skilled Data Analyst with 6+ years of experience in analytics, data modeling, and advanced SQL. The ideal candidate has strong expertise in building scalable data models using dbt, writing efficient Python scripts, and delivering high-quality insights that support data-driven decision-making.
Key Responsibilities
Design, develop, and maintain data models using dbt (Core and dbt Cloud).
Build and optimize complex SQL queries to support reporting, analytics, and data pipelines.
Write Python scripts for data transformation, automation, and analytics workflows (see the sketch after this list).
Ensure data quality, integrity, and consistency across multiple data sources.
Collaborate with cross-functional teams (Engineering, Product, Business) to understand data needs.
Develop dashboards and reports to visualize insights (using tools such as Tableau, Looker, or Power BI).
Perform deep-dive exploratory analysis to identify trends, patterns, and business opportunities.
Document data models, pipelines, and processes.
Contribute to scaling the analytics stack and improving data architecture.
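As a small illustration of the SQL and scripting mix above, a window-function query (7-day moving average) driven from Python; the table is synthetic and sqlite3 is used only so the sketch is self-contained (requires SQLite 3.25+ for window functions):

    # 7-day moving average via a SQL window function.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE daily_revenue (d TEXT, revenue REAL);
        INSERT INTO daily_revenue VALUES
            ('2024-01-01', 100), ('2024-01-02', 120), ('2024-01-03', 90);
    """)
    rows = conn.execute("""
        SELECT d, revenue,
               AVG(revenue) OVER (
                   ORDER BY d ROWS BETWEEN 6 PRECEDING AND CURRENT ROW
               ) AS ma_7d
        FROM daily_revenue
        ORDER BY d;
    """).fetchall()
    for r in rows:
        print(r)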
Required Qualifications
6 - 9 years of hands-on experience in data analytics or data engineering.
Expert-level skills in SQL (complex joins, window functions, performance tuning).
Strong experience building and maintaining dbt data models.
Proficiency in Python for data manipulation, scripting, and automation.
Solid understanding of data warehousing concepts (e.g., dimensional modeling, ELT/ETL pipelines).
Understanding of cloud data platforms (Snowflake, BigQuery, Redshift, etc.).
Strong analytical thinking and problem-solving skills.
Excellent communication skills with the ability to present insights to stakeholders.
Trino and lakehouse architecture experience is good to have.
Skills required:
- Strong expertise in .NET Core / ASP.NET MVC
- Candidate must have 4+ years of experience in .NET.
- Candidate must have experience with Angular.
- Hands-on experience with Entity Framework & LINQ
- Experience with SQL Server (performance tuning, stored procedures, indexing)
- Understanding of multi-tenancy architecture
- Experience with Microservices / API development (REST, GraphQL)
- Hands-on experience in Azure Services (App Services, Azure SQL, Blob Storage, Key Vault, Functions, etc.)
- Experience in CI/CD pipelines with Azure DevOps
- Knowledge of security best practices in cloud-based applications
- Familiarity with Agile/Scrum methodologies
- Willingness to use Copilot or other AI tools to write automated test cases and speed up code writing
Roles and Responsibilities:
- Good communication skills are a must.
- Develop features across multiple subsystems within our applications, including collaboration in requirements definition, prototyping, design, coding, testing, and deployment.
- Understand how our applications operate, are structured, and how customers use them
- Provide engineering support (when necessary) to our technical operations staff when they are building, deploying, configuring, and supporting systems for customers.
About NuWare
NuWare is a global technology and IT services company built on the belief that organizations require transformational strategies to scale, grow and build into the future owing to a dynamically evolving ecosystem. We strive towards our clients’ success in today’s hyper-competitive market by servicing their needs with next-gen technologies - AI/ML, NLP, chatbots, digital and automation tools.
We empower businesses to enhance their competencies, processes and technologies to fully leverage opportunities and accelerate impact. Through our focus on market differentiation and innovation - we offer services that are agile, streamlined, efficient and customer-centric.
Headquartered in Iselin, NJ, NuWare has been creating business value and generating growth opportunities for clients through its network of partners, global resources, highly skilled talent and SME’s for 25 years. NuWare is technology agnostic and offers services for Systems Integration, Cloud, Infrastructure Management, Mobility, Test automation, Data Sciences and Social & Big Data Analytics.
Skills Required
- Automation testing with UFT, strong SQL skills, and good communication skills
- 5 years of experience in automation testing
- Experience with UFT for at least 3 years
- Good knowledge of VB Scripting
- Knowledge of Manual testing
- Knowledge of automation frameworks
Required Skills: CI/CD Pipeline, Kubernetes, SQL Database, Excellent Communication & Stakeholder Management, Python
Criteria:
Looking for candidates with a notice period of 15 days (max 30 days) only.
Looking for candidates from the Hyderabad location only.
Looking for candidates from EPAM only.
1. 4+ years of software development experience
2. Strong experience with Kubernetes, Docker, and CI/CD pipelines in cloud-native environments.
3. Hands-on with NATS for event-driven architecture and streaming.
4. Skilled in microservices, RESTful APIs, and containerized app performance optimization.
5. Strong in problem-solving, team collaboration, clean code practices, and continuous learning.
6. Proficient in Python (Flask) for building scalable applications and APIs.
7. Focus: Java, Python, Kubernetes, Cloud-native development
8. SQL database experience
Description
Position Overview
We are seeking a skilled Developer to join our engineering team. The ideal candidate will have strong expertise in Java and Python ecosystems, with hands-on experience in modern web technologies, messaging systems, and cloud-native development using Kubernetes.
Key Responsibilities
- Design, develop, and maintain scalable applications using Java and Spring Boot framework
- Build robust web services and APIs using Python and Flask framework
- Implement event-driven architectures using NATS messaging server (a pub/sub sketch follows this list)
- Deploy, manage, and optimize applications in Kubernetes environments
- Develop microservices following best practices and design patterns
- Collaborate with cross-functional teams to deliver high-quality software solutions
- Write clean, maintainable code with comprehensive documentation
- Participate in code reviews and contribute to technical architecture decisions
- Troubleshoot and optimize application performance in containerized environments
- Implement CI/CD pipelines and follow DevOps best practices
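To ground the NATS responsibility above, a minimal pub/sub sketch with the nats-py client (server URL and subject are hypothetical; queue="workers" demonstrates a queue group, where each message goes to exactly one member of the group):

    # Minimal NATS pub/sub with a queue-group subscriber.
    import asyncio
    import nats

    async def main():
        nc = await nats.connect("nats://localhost:4222")  # hypothetical server

        async def handler(msg):
            print(f"received on {msg.subject}: {msg.data.decode()}")

        await nc.subscribe("orders.created", queue="workers", cb=handler)
        await nc.publish("orders.created", b'{"order_id": 1}')
        await nc.flush()
        await asyncio.sleep(0.1)  # let the handler run
        await nc.close()

    asyncio.run(main())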
Required Qualifications
- Bachelor's degree in Computer Science, Information Technology, or related field
- 4+ years of experience in software development
- Strong proficiency in Java with deep understanding of web technology stack
- Hands-on experience developing applications with Spring Boot framework
- Solid understanding of Python programming language with practical Flask framework experience
- Working knowledge of NATS server for messaging and streaming data
- Experience deploying and managing applications in Kubernetes
- Understanding of microservices architecture and RESTful API design
- Familiarity with containerization technologies (Docker)
- Experience with version control systems (Git)
Skills & Competencies
- Skills: Java (Spring Boot, Spring Cloud, Spring Security)
- Python (Flask, SQLAlchemy, REST APIs)
- NATS messaging patterns (pub/sub, request/reply, queue groups)
- Kubernetes (deployments, services, ingress, ConfigMaps, Secrets)
- Web technologies (HTTP, REST, WebSocket, gRPC)
- Container orchestration and management
- Soft skills: problem-solving and analytical thinking
- Strong communication and collaboration
- Self-motivated with ability to work independently
- Attention to detail and code quality
- Continuous learning mindset
- Team player with mentoring capabilities
Profile: Senior Data Engineer (Informatica MDM)
Primary Purpose:
The Senior Data Engineer will be responsible for building new segments in a Customer Data Platform (CDP), maintaining those segments, and understanding the data requirements, data integrity, data quality, and data sources involved in building specific use cases. The candidate should also understand ETL processes and integrations with cloud service providers such as Microsoft Azure, Azure Data Lake Services, Azure Data Factory, and cloud data warehouse platforms, in addition to Enterprise Data Warehouse environments. The ideal candidate will also have proven experience in data analysis and management, with excellent analytical and problem-solving abilities.
Major Functions/Responsibilities
• Design, develop and implement robust and extensible solutions to build segmentations using Customer Data Platform.
• Work closely with subject matter experts to identify and document business requirements and functional specs, and translate them into appropriate technical solutions.
• Responsible for estimating, planning, and managing the user stories, tasks and reports on Agile Projects.
• Develop advanced SQL Procedures, Functions and SQL jobs.
• Performance tuning and optimization of ETL Jobs, SQL Queries and Scripts.
• Configure and maintain scheduled ETL jobs, data segments and refresh.
• Support exploratory data analysis, statistical analysis, and predictive analytics.
• Support production issues and maintain existing data systems by researching and troubleshooting any issues/problems in a timely manner.
• Proactive, great attention to detail, results-oriented problem solver.
Preferred Experience
• 6+ years of experience writing SQL queries and stored procedures to extract, manipulate, and load data.
• 6+ years of experience designing, building, testing, and maintaining data integrations for data marts and data warehouses.
• 3+ years of experience integrating Azure/AWS Data Lakes, Azure Data Factory, and IDMC (Informatica Cloud Services).
• In-depth understanding of database management systems, online analytical processing (OLAP), and the ETL (Extract, Transform, Load) framework.
• Excellent verbal and written communication skills
• Collaboration with both onshore and offshore development teams.
• A good understanding of marketing tools like Salesforce Marketing Cloud, Adobe Marketing, or Microsoft Customer Insights Journey, and of Customer Data Platforms, will be important to this role.
Communication
• Facilitate project team meetings effectively.
• Effectively communicate relevant project information to superiors
• Deliver engaging, informative, well-organized presentations that are effectively tailored to the intended audience.
• Serve as a technical liaison with development partner.
• Serve as a communication bridge between applications team, developers and infrastructure team members to facilitate understanding of current systems
• Resolve and/or escalate issues in a timely fashion.
• Understand how to communicate difficult/sensitive information tactfully.
• Works under the direction of the Technical Data Lead / Data Architect.
Education
• Bachelor’s degree or higher in Engineering, Technology, or a related field required.
Review Criteria
- Strong Data Scientist / Machine Learning / AI Engineer profile
- 2+ years of hands-on experience as a Data Scientist or Machine Learning Engineer building ML models
- Strong expertise in Python with the ability to implement classical ML algorithms including linear regression, logistic regression, decision trees, gradient boosting, etc.
- Hands-on experience in at least two of the following use cases: recommendation systems, image data, fraud/risk detection, price modelling, propensity models
- Strong exposure to NLP, including text generation or text classification, embeddings, similarity models, user profiling, and feature extraction from unstructured text
- Experience productionizing ML models through APIs/CI/CD/Docker and working on AWS or GCP environments
- Preferred (Company) – Must be from product companies
Job Specific Criteria
- CV Attachment is mandatory
- What's your current company?
- Which use cases do you have hands-on experience with?
- Are you ok for Mumbai location (if candidate is from outside Mumbai)?
- Reason for change (if candidate has been in current company for less than 1 year)?
- Reason for hike (if greater than 25%)?
Role & Responsibilities
- Partner with Product to spot high-leverage ML opportunities tied to business metrics.
- Wrangle large structured and unstructured datasets; build reliable features and data contracts.
- Build and ship models to:
- Enhance customer experiences and personalization
- Boost revenue via pricing/discount optimization
- Power user-to-user discovery and ranking (matchmaking at scale)
- Detect and block fraud/risk in real time
- Score conversion/churn/acceptance propensity for targeted actions
- Collaborate with Engineering to productionize via APIs/CI/CD/Docker on AWS.
- Design and run A/B tests with guardrails.
- Build monitoring for model/data drift and business KPIs
Ideal Candidate
- 2–5 years of DS/ML experience in consumer internet / B2C products, with 7–8 models shipped to production end-to-end.
- Proven, hands-on success in at least two (preferably 3–4) of the following:
- Recommender systems (retrieval + ranking, NDCG/Recall, online lift; bandits a plus)
- Fraud/risk detection (severe class imbalance, PR-AUC); see the sketch after this list
- Pricing models (elasticity, demand curves, margin vs. win-rate trade-offs, guardrails/simulation)
- Propensity models (payment/churn)
- Programming: strong Python and SQL; solid git, Docker, CI/CD.
- Cloud and data: experience with AWS or GCP; familiarity with warehouses/dashboards (Redshift/BigQuery, Looker/Tableau).
- ML breadth: recommender systems, NLP or user profiling, anomaly detection.
- Communication: clear storytelling with data; can align stakeholders and drive decisions.
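To make the fraud/risk bullet concrete, a minimal scikit-learn sketch on synthetic, heavily imbalanced data, scored with PR-AUC rather than accuracy (all data here is synthetic; real features would come from your pipeline):

    # Imbalanced classification scored with PR-AUC (average precision).
    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.metrics import average_precision_score
    from sklearn.model_selection import train_test_split

    X, y = make_classification(
        n_samples=20000, n_features=20, weights=[0.99, 0.01], random_state=0
    )  # ~1% positives, mimicking fraud rates
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

    model = GradientBoostingClassifier().fit(X_tr, y_tr)
    scores = model.predict_proba(X_te)[:, 1]
    print("PR-AUC:", average_precision_score(y_te, scores))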
Review Criteria
- Strong Implementation Manager / Customer Success Implementation / Technical Solutions / Post-Sales SaaS Delivery
- 3+ years of hands-on experience in software/tech Implementation roles within technical B2B SaaS companies, preferably working with global or US-based clients
- Must have direct experience leading end-to-end SaaS product implementations — including onboarding, workflow configuration, API integrations, data setup, and customer training
- Must have strong technical understanding — including ability to read and write basic SQL queries, debug API workflows, and interpret JSON payloads for troubleshooting or configuration validation (see the sketch after this list).
- Must have worked in post-sales environments, owning customer success and delivery after deal closure, ensuring product adoption, accurate setup, and smooth go-live.
- Must have experience collaborating cross-functionally with product, engineering, and sales teams to ensure timely resolution of implementation blockers and seamless client onboarding.
- (Company): B2B SaaS startup or growth-stage company
- Mandatory (Note): Good growth opportunity, this role will have team leading option after a few months
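The SQL/JSON expectation above is roughly this kind of work; a minimal sketch in Python (endpoint and field names are hypothetical):

    # Fetch an API payload and validate the fields a configuration depends on.
    import json
    import requests

    resp = requests.get(
        "https://api.example-billing.com/v1/invoices/INV-1001", timeout=10
    )  # hypothetical endpoint
    print(resp.status_code)
    payload = resp.json()
    print(json.dumps(payload, indent=2))  # inspect the raw payload
    missing = [f for f in ("customer_id", "line_items", "total") if f not in payload]
    print("missing fields:", missing or "none")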
Preferred
- Preferred (Experience): Previous experience in FinTech SaaS like BillingTech, finance automation, or subscription management platforms will be a strong plus
Job Specific Criteria
- CV Attachment is mandatory
- Are you open to work in US timings (4/5:00 PM - 3:00 AM) - to target the US market?
- Please provide CTC Breakup (Fixed + Variable)?
- It’s a hybrid role with 1–3 days work from office (Indiranagar), with in-office hours 3:00 pm to 10:00 pm IST; are you ok with hybrid mode?
Role & Responsibilities
As the new hire in this role, you'll be the voice of the customer in the company, and lead the charge in developing our customer-centric approach, working closely with our tech, design, and product teams.
What you will be doing:
You will be responsible for converting, onboarding, managing, and proactively ensuring success for our customers/prospective clients.
- Implementation
- Understand client billing models and configure company contracts, pricing, metering, and invoicing accurately.
- Lead pilots and implementation for new customers, ensuring complete onboarding within 3–8 weeks.
- Translate complex business requirements into structured company workflows and setup.
- Pre-sales & Technical Discovery
- Support sales with live demos, sandbox setups, and RFP responses.
- Participate in technical discovery calls to map company capabilities to client needs.
- Create and maintain demo environments showcasing relevant use cases.
- Internal Coordination & Escalation
- Act as the voice of the customer internally — share structured feedback with product and engineering.
- Create clear, well-scoped handoff documents when working with technical teams.
- Escalate time-sensitive issues appropriately and follow through on resolution.
- Documentation & Enablement
- Create client-specific documentation (e.g., onboarding guides, configuration references).
- Contribute to internal wikis, training material, and product documentation.
- Write simple, to-the-point communication — clear enough for a CXO and detailed enough for a developer.
Ideal Candidate
- 3-7 years of relevant experience
- Willing to work in the US time zone (~4:30 am IST) on weekdays (Mon-Fri)
- Ability to understand and shape the product at a granular level
- Ability to empathize with the customers, and understand their pain points
- Understanding of SaaS architecture and APIs conceptually — ability to debug API workflows and usage issues
- Previous experience with Salesforce CRM
- Entrepreneurial drive, and willingness to wear multiple hats as per company’s requirements
- Strong analytical skills and a structured problem-solving approach
- (Strongly preferred) Computer science background and basic coding experience
- Ability to understand functional aspects related to the product, e.g., accounting/revenue recognition, receivables, billing, etc.
- Self-motivated and proactive in managing tasks and responsibilities, requiring minimal follow-ups.
- Self-driven individual with high ownership and strong work ethic
- Not taking yourself too seriously.
Job Description
We are looking for motivated IT professionals with at least one year of industry experience. The ideal candidate should have hands-on experience in AWS, Azure, AI, or Cloud technologies, or should be enthusiastic and ready to upskill and shift to new and emerging technologies. This role is primarily remote; however, candidates may be required to visit the office occasionally for meetings or project needs.
Key Requirements
- Minimum 1 year of experience in the IT industry
- Exposure to AWS / Azure / AI / Cloud platforms (any one or more)
- Willingness to learn and adapt to new technologies
- Strong problem-solving and communication skills
- Ability to work independently in a remote setup
- Must have a proper work-from-home environment (laptop, stable internet, quiet workspace)
Education Qualification
- B.Tech / BE / MCA / M.Sc (IT) / equivalent
Role Overview
We are looking for a passionate Software Engineer with 1–3 years of hands-on experience in backend engineering to join our team in Mumbai. The ideal candidate will have strong programming skills in GoLang, a solid understanding of SQL databases, and exposure to or interest in High Performance Computing (HPC) concepts. You will be responsible for designing, developing, optimizing, and maintaining backend services that are scalable, efficient, and secure.
Key Responsibilities
- Design, develop, and maintain backend services and microservices using GoLang
- Design and optimize database schemas and write efficient SQL queries for relational databases
- Work on high-performance applications by optimizing code, memory usage, and execution speed
- Collaborate with cross-functional teams including frontend, DevOps, QA, and product
- Participate in code reviews, troubleshoot production issues, and follow best engineering practices
- Contribute to improving system performance, reliability, and scalability
- Stay up to date with emerging backend technologies, tools, and frameworks
Required Skills
Technical Skills
- 1–5 years of experience in backend development
- Strong hands-on experience with Go (GoLang)
- Good understanding of SQL and relational database design
- Exposure to or understanding of HPC concepts such as concurrency, parallelism, distributed processing, or performance optimization
- Experience with RESTful APIs and microservice architectures
- Familiarity with version control systems (Git)
Soft Skills
- Strong analytical and problem-solving abilities
- Ability to work effectively in a fast-paced, collaborative team environment
- Good communication and documentation skills
- Strong ownership mindset with a willingness to learn
Good to Have
- Experience with cloud platforms (AWS, Azure, or GCP)
- Knowledge of Docker or other containerization tools
- Understanding of CI/CD pipelines
- Experience with performance profiling and monitoring tools
Education
- Bachelor’s degree in Computer Science, Engineering, or a related technical field
Why Join Oneture Technologies?
- Opportunity to work on high-impact, modern technology projects
- Learning-driven culture with strong mentorship and continuous upskilling
- Exposure to cloud-native and cutting-edge backend technologies
- Collaborative, startup-like environment with real ownership of projects
1. Functional Testing & Validation
- Web Application Testing: Design, document, and execute comprehensive functional test plans and test cases for complex, highly interactive web applications, ensuring they meet specified requirements and provide an excellent user experience.
- Backend API Testing: Possess deep expertise in validating backend RESTful and/or SOAP APIs. This includes testing request/response payloads, status codes, data integrity, security, and robust error handling mechanisms.
- Data Validation with SQL: Write and execute complex SQL queries (joins, aggregations, conditional logic) to perform backend data checks, verify application states, and ensure data integrity across integration points (a sketch follows this list).
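As a rough illustration of the validation queries meant above (a join, an aggregation, and conditional logic in a single check), here is a self-contained Python/sqlite3 sketch; the tables, columns, and data are invented for the example.

```python
import sqlite3

# Invented schema: orders joined to payments, with a conditional
# aggregation that flags orders whose payments do not add up.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer TEXT, total REAL);
    CREATE TABLE payments (order_id INTEGER, paid REAL);
    INSERT INTO orders VALUES (1, 'acme', 100.0), (2, 'globex', 50.0);
    INSERT INTO payments VALUES (1, 60.0), (1, 40.0), (2, 25.0);
""")
query = """
    SELECT o.order_id,
           o.total,
           SUM(p.paid) AS paid,
           CASE WHEN SUM(p.paid) = o.total THEN 'ok' ELSE 'mismatch' END AS check_result
    FROM orders o
    JOIN payments p ON p.order_id = o.order_id
    GROUP BY o.order_id, o.total
"""
for row in conn.execute(query):
    print(row)  # (1, 100.0, 100.0, 'ok') then (2, 50.0, 25.0, 'mismatch')
```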
2. UI Automation (Playwright & TypeScript)
- Design, develop, and maintain robust, scalable, and reusable UI automation scripts using Playwright and TypeScript.
- Integrate automation suites into Continuous Integration/Continuous Deployment (CI/CD) pipelines.
- Implement advanced automation patterns and frameworks (e.g., Page Object Model) to enhance maintainability.
- Prompt-Based Automation: Demonstrate familiarity or hands-on experience with emerging AI-driven or prompt-based automation approaches and tools to accelerate test case generation and execution.
- API Automation: Develop and maintain automated test suites for APIs to ensure reliability and performance.
3. Performance & Load Testing
- JMeter Proficiency: Utilize Apache JMeter to design, script, and execute robust API load testing and stress testing scenarios.
- Performance Analysis: Analyse performance metrics, identify bottlenecks (e.g., response time, throughput), and provide actionable reports to development teams (a conceptual sketch follows this list).
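JMeter is the tool named above; purely to illustrate the two metrics in question (throughput and response time), the sketch below hand-rolls a tiny concurrent load probe in Python. The URL, worker count, and request count are placeholders.

```python
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "https://example.com/"  # placeholder endpoint

def probe(_):
    # Time a single request and return (status, latency in seconds).
    start = time.perf_counter()
    with urllib.request.urlopen(URL, timeout=10) as resp:
        resp.read()
        return resp.status, time.perf_counter() - start

t0 = time.perf_counter()
with ThreadPoolExecutor(max_workers=10) as pool:
    results = list(pool.map(probe, range(50)))
elapsed = time.perf_counter() - t0

latencies = sorted(latency for _, latency in results)
print(f"throughput: {len(results) / elapsed:.1f} req/s")
print(f"p95 latency: {latencies[int(0.95 * len(latencies))]:.3f} s")
```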
🛠️ Required Skills and Qualifications
- Experience: 4+ years of professional experience in Quality Assurance and Software Testing, with a strong focus on automation.
- Automation Stack: Expert-level proficiency in developing and maintaining automation scripts using Playwright and TypeScript.
- Testing Tools: Proven experience with API testing tools (e.g., Postman, Swagger) and strong functional testing methodologies.
- Database Skills: Highly proficient in writing and executing complex SQL queries for data validation and backend verification.
- Performance: Hands-on experience with Apache JMeter for API performance and load testing.
- Communication: Excellent communication and collaboration skills to work effectively with cross-functional teams (Developers, Product Managers).
- Problem-Solving: Strong analytical and debugging skills to efficiently isolate and report defects.
Job Description – SEO Specialist
Company: Capace Software Pvt. Ltd.
Location: Bhopal / Bangalore (On-site)
Experience: 2+ Years
Budget: Up to ₹4 LPA
Position: Full-Time
About the Role
Capace Software Pvt. Ltd. is looking for a skilled SEO Specialist with strong expertise in On-Page SEO, Off-Page SEO, and Technical SEO. The ideal candidate will be responsible for improving our search engine ranking, driving organic traffic, and ensuring technical search requirements are met across websites.
Key Responsibilities
🔹 On-Page SEO
- Optimize meta titles, descriptions, header tags, and URLs
- Conduct in-depth keyword research and implement strategic keyword placement
- Optimize website content for relevancy and readability
- Implement internal linking strategies
- Optimize images, schema, and site structure for SEO
- Ensure webpages follow SEO best practices
🔹 Off-Page SEO
- Create and execute backlink strategies
- Manage directory submissions, social bookmarking, classified listings
- Conduct competitor backlink analysis
- Build high-quality guest post links and outreach
- Improve brand visibility through digital promotions
🔹 Technical SEO
- Conduct website audits (crawl errors, index issues, technical fixes)
- Optimize website speed and performance
- Implement schema markup and structured data (see the example after this list)
- Manage XML sitemaps and robots.txt
- Resolve indexing, crawling, and canonical issues
- Work with developers to implement technical updates
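In practice, "schema markup and structured data" usually means JSON-LD embedded in a page's `<script type="application/ld+json">` tag. Below is a minimal sketch, generated here with Python; the URLs are placeholders.

```python
import json

# Hypothetical Organization markup; the output would be embedded in the HTML head.
org_schema = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Capace Software Pvt. Ltd.",
    "url": "https://www.example.com",                         # placeholder
    "sameAs": ["https://www.linkedin.com/company/example"],   # placeholder
}
print(json.dumps(org_schema, indent=2))
```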
Requirements
- Minimum 2+ years of experience in SEO
- Strong knowledge of On-Page, Off-Page & Technical SEO
- Experience with tools like:
- Google Analytics
- Google Search Console
- Ahrefs / SEMrush / Ubersuggest
- Screaming Frog (good to have)
- Understanding of HTML, CSS basics (preferred)
- Strong analytical and reporting skills
- Good communication and documentation skills
What We Offer
- Competitive salary up to ₹4 LPA
- Opportunity to work on multiple SaaS products and websites
- Supportive team & learning-focused environment
- Career growth in digital marketing & SEO domain
Role Overview
We are seeking an experienced Python Backend Developer with strong expertise in SDK development, API design, and application security. The ideal candidate will build robust backend systems, integrate third-party services, and ensure secure, scalable backend operations.
Key Responsibilities
- Design, develop, and maintain backend services using Python and modern frameworks (e.g., FastAPI, Django, Flask).
- Build and maintain SDKs to support internal and external integrations.
- Develop clean, scalable, and reusable RESTful and/or GraphQL APIs.
- Implement and enforce security best practices, including authentication, authorization, encryption, secrets management, and OWASP guidelines.
- Collaborate with frontend, DevOps, and product teams to deliver end-to-end features.
- Integrate external APIs and third-party services efficiently and securely.
- Optimize backend performance, scalability, logging, and monitoring.
- Write automated tests and maintain high code quality through CI/CD pipelines.
- Work with client SMEs to understand existing workflows, formulas, and rules, and translate them into maintainable backend services
- Consume and work with existing data models and database schemas (SQL/NoSQL) to support analytical workflows, operational planning applications, and integration of machine learning outputs into backend services.
- Leverage Redis (or similar in-memory stores) for caching and performance optimization, ensuring fast response times for data-driven APIs and applications (a cache-aside sketch follows this list).
- Utilize middleware, message queues, and streaming technologies (e.g., Kafka, Event Hubs, RabbitMQ) to build reliable, scalable data flows and event-driven backend services.
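A minimal sketch of the caching pattern described above (cache-aside with a TTL), assuming the redis-py client and a Redis server on localhost; the key names and data are illustrative.

```python
import json
import redis  # redis-py; assumes a Redis server on localhost:6379

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def get_plan_summary(plan_id: str) -> dict:
    """Cache-aside lookup: try Redis first, fall back to the (stubbed) source."""
    cached = r.get(f"plan:{plan_id}")
    if cached is not None:
        return json.loads(cached)
    summary = {"plan_id": plan_id, "status": "active"}  # stand-in for a slow query
    r.setex(f"plan:{plan_id}", 300, json.dumps(summary))  # expire after 5 minutes
    return summary

print(get_plan_summary("p-42"))
```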
Required Skills & Qualifications
- Bachelor’s or Master’s degree in Computer Science, Artificial Intelligence, Software Engineering, Data Science or a related field
- Proven experience of 5+ years as a Python Developer specializing in backend systems.
- Hands-on experience with SDK design, development, and documentation.
- Strong knowledge of API development (REST, GraphQL), API versioning, and standards.
- Strong understanding of data modeling, multi-source data integration (SQL/NoSQL/warehouse), and analytical data flows.
- Solid understanding of application security (a minimal token-verification sketch follows this list), including:
- OAuth2, JWT, API keys
- Secure coding practices
- Data privacy & encryption
- Security testing & vulnerability mitigation
- Experience with Python frameworks such as FastAPI, Django, Flask.
- Knowledge of databases (PostgreSQL, MySQL, MongoDB, Redis).
- Familiarity with CI/CD, Git, Docker, Kubernetes and cloud platforms (AWS, GCP, Azure).
- Experience with caching (Redis), asynchronous processing, and performance tuning for low-latency user interactions.
- Knowledge of message brokers (Kafka, Event Hubs, RabbitMQ) and event-driven architecture for workflow orchestration.
- Strong analytical skills with complex Excel models, including familiarity with advanced formulas, pivot tables, and user-defined Excel functions
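To ground the security items above, here is a minimal issue-and-verify sketch using PyJWT. HS256 and the hard-coded secret are for illustration only; a production service would load keys from a secrets manager and likely verify tokens inside framework middleware.

```python
import datetime
import jwt  # PyJWT

SECRET = "change-me"  # illustrative; never hard-code secrets in real services

# Issue a short-lived token for a hypothetical user.
token = jwt.encode(
    {
        "sub": "user-123",
        "exp": datetime.datetime.now(datetime.timezone.utc)
        + datetime.timedelta(minutes=15),
    },
    SECRET,
    algorithm="HS256",
)

# Verify it: PyJWT checks the signature and the exp claim, raising on failure.
try:
    claims = jwt.decode(token, SECRET, algorithms=["HS256"])
    print("authenticated:", claims["sub"])
except jwt.InvalidTokenError as exc:
    print("rejected:", exc)
```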
Preferred Qualifications
- Experience building public or enterprise-level SDKs.
- Hands-on experience with event-driven architectures, message queues, or streaming technologies
- Familiarity with workflow orchestration tools (e.g., Airflow, Prefect, Dagster, Azure Data Factory)
- Familiarity with data warehousing or analytical query optimization (Snowflake, BigQuery, Synapse, Redshift).
- Exposure to MLOps tools like MLflow, BentoML, Seldon, SageMaker, Vertex AI, or Databricks ML.
Competencies:
- Tech Savvy - Anticipating and adopting innovations in business-building digital and technology applications.
- Self-Development - Actively seeking new ways to grow and be challenged using both formal and informal development channels.
- Action Oriented - Taking on new opportunities and tough challenges with a sense of urgency, high energy, and enthusiasm.
- Customer Focus - Building strong customer relationships and delivering customer-centric solutions.
- Optimize Work Processes - Knowing the most effective and efficient processes to get things done, with a focus on continuous improvement.
Why Join Us?
- Be part of a collaborative and agile team driving cutting-edge AI and data engineering solutions.
- Work on impactful projects that make a difference across industries.
- Opportunities for professional growth and continuous learning.
- Competitive salary and benefits package.
Application Details
Ready to make an impact? Apply today and become part of the QX Impact team!
Job Summary:
We are seeking a highly skilled and self-driven Java Backend Developer with strong experience in designing and deploying scalable microservices using Spring Boot and Azure Cloud. The ideal candidate will have hands-on expertise in modern Java development, containerization, messaging systems like Kafka, and knowledge of CI/CD and DevOps practices.
Key Responsibilities:
- Design, develop, and deploy microservices using Spring Boot on Azure cloud platforms.
- Implement and maintain RESTful APIs, ensuring high performance and scalability.
- Work with Java 11+ features including Streams, Functional Programming, and Collections framework.
- Develop and manage Docker containers, enabling efficient development and deployment pipelines.
- Integrate messaging services like Apache Kafka into microservice architectures.
- Design and maintain data models using PostgreSQL or other SQL databases.
- Implement unit testing using JUnit and mocking frameworks to ensure code quality.
- Develop and execute API automation tests using Cucumber or similar tools.
- Collaborate with QA, DevOps, and other teams for seamless CI/CD integration and deployment pipelines.
- Work with Kubernetes for orchestrating containerized services.
- Utilize Couchbase or similar NoSQL technologies when necessary.
- Participate in code reviews, design discussions, and contribute to best practices and standards.
Required Skills & Qualifications:
- Strong experience in Java (11 or above) and Spring Boot framework.
- Solid understanding of microservices architecture and deployment on Azure.
- Hands-on experience with Docker, and exposure to Kubernetes.
- Proficiency in Kafka, with real-world project experience.
- Working knowledge of PostgreSQL (or any SQL DB) and data modeling principles.
- Experience in writing unit tests using JUnit and mocking tools.
- Experience with Cucumber or similar frameworks for API automation testing.
- Exposure to CI/CD tools, DevOps processes, and Git-based workflows.
Nice to Have:
- Azure certifications (e.g., Azure Developer Associate)
- Familiarity with Couchbase or other NoSQL databases.
- Familiarity with other cloud providers (AWS, GCP)
- Knowledge of observability tools (Prometheus, Grafana, ELK)
Soft Skills:
- Strong problem-solving and analytical skills.
- Excellent verbal and written communication.
- Ability to work in an agile environment and contribute to continuous improvement.
Why Join Us:
- Work on cutting-edge microservice architectures
- Strong learning and development culture
- Opportunity to innovate and influence technical decisions
- Collaborative and inclusive work environment
Dear Candidate
Candidate must have:
- Minimum 3-5 years of experience working as a NOC Engineer / Senior NOC Engineer in the telecom/product industry (preferably telecom monitoring).
- BE in CS, EE, or Telecommunications from a recognized university.
- Knowledge of NOC processes
- Technology exposure to telecom (5G, 4G, IMS) with a solid understanding of telecom performance KPIs and/or the Radio Access Network; knowledge of call flows will be an advantage
- Experience with Linux OS and SQL – mandatory.
- Residence in Delhi – mandatory.
- Ready to work in a 24×7 environment.
- Ability to monitor alarms based on our environment.
- Capability to identify and resolve issues occurring in the RADCOM environment.
- Any relevant technical certification will be an added advantage.
Responsibilities:
- Based in RADCOM India offices, Delhi.
- Responsible for all NOC monitoring and technical support (T1/T2) aspects required by the process for RADCOM’s solutions.
- Ready to participate in customer-planned activities, execution, and monitoring.