
About Wibmo
We are looking for a Technical Lead - GenAI with a strong foundation in Python, Data Analytics, Data Science or Data Engineering, system design, and practical experience in building and deploying Agentic Generative AI systems. The ideal candidate is passionate about solving complex problems using LLMs, understands the architecture of modern AI agent frameworks like LangChain/LangGraph, and can deliver scalable, cloud-native back-end services with a GenAI focus.
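The agentic loop that frameworks like LangChain/LangGraph orchestrate can be sketched without any framework at all. Below is a minimal, framework-free illustration; `stub_model`, `calculator`, and the routing rule are all invented stand-ins, not the LangChain/LangGraph API:

```python
# Minimal sketch of a tool-augmented agent loop, framework-free.
# A stubbed "model" picks a tool; a real system would call an LLM and
# use LangChain/LangGraph for routing, memory, and retries.

def calculator(expression: str) -> str:
    """Toy tool: evaluate a simple arithmetic expression."""
    return str(eval(expression, {"__builtins__": {}}))

TOOLS = {"calculator": calculator}

def stub_model(query: str) -> dict:
    """Stand-in for an LLM call: route arithmetic to the calculator."""
    if any(ch.isdigit() for ch in query):
        return {"tool": "calculator", "input": query}
    return {"tool": None, "answer": "I need a question with numbers."}

def run_agent(query: str) -> str:
    decision = stub_model(query)           # 1. model decides on a tool
    if decision["tool"] is None:
        return decision["answer"]          # 2a. answer directly
    tool = TOOLS[decision["tool"]]         # 2b. dispatch to the chosen tool
    observation = tool(decision["input"])  # 3. execute and observe
    return f"Result: {observation}"        # 4. final answer from observation

print(run_agent("2 + 3 * 4"))  # Result: 14
```

Real agent frameworks add what this sketch omits: multi-step loops, tool schemas, state persistence, and error recovery.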
Key Responsibilities:
- Design and implement robust, scalable back-end systems for GenAI agent-based platforms.
- Work closely with AI researchers and front-end teams to integrate LLMs and agentic workflows into production services.
- Develop and maintain services using Python (FastAPI/Django/Flask), with best practices in modularity and performance.
- Leverage and extend frameworks like LangChain, LangGraph, and similar to orchestrate tool-augmented AI agents.
- Design and deploy systems in Azure Cloud, including usage of serverless functions, Kubernetes, and scalable data services.
- Build and maintain event-driven / streaming architectures using Kafka, Event Hubs, or other messaging frameworks.
- Implement inter-service communication using gRPC and REST.
- Contribute to architectural discussions, especially around distributed systems, data flow, and fault tolerance.
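The event-driven/streaming responsibility above can be illustrated with a minimal in-memory pub/sub sketch. The topic name and payload are hypothetical; a production system would use a broker client (Kafka or Azure Event Hubs), which this deliberately does not show:

```python
# Minimal in-memory pub/sub sketch of an event-driven architecture.
# A real deployment would replace the dict with a Kafka/Event Hubs
# client; topic and payload below are illustrative only.
from collections import defaultdict

class EventBus:
    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> list of handlers

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Real brokers deliver asynchronously and durably; this is sync.
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
received = []
bus.subscribe("agent.completed", received.append)
bus.publish("agent.completed", {"run_id": "r1", "status": "ok"})
print(received)  # [{'run_id': 'r1', 'status': 'ok'}]
```

The decoupling shown here (publishers never reference subscribers) is what makes the pattern fault-tolerant at scale.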
Required Skills & Qualifications :
- Strong hands-on back-end development experience in Python along with Data Analytics or Data Science.
- Strong track record on platforms like LeetCode or in real-world algorithmic/system problem-solving.
- Deep knowledge of at least one Python web framework (e.g., FastAPI, Flask, Django).
- Solid understanding of LangChain, LangGraph, or equivalent LLM agent orchestration tools.
- 2+ years of hands-on experience in Generative AI systems and LLM-based platforms.
- Proven experience with system architecture, distributed systems, and microservices.
- Strong familiarity with infrastructure and deployment practices on any major cloud platform.
- Data Engineering or Analytics expertise preferred, e.g., Azure Data Factory, Snowflake, Databricks, ETL tools (Talend, Informatica), BI tools (Power BI, Tableau), data modelling, and data warehouse development.
CTC Budget: 35-50 LPA
Location: Hyderabad/Bangalore
Experience: 8+ Years
Company Overview:
An 8-year-old IT services and consulting company based in Hyderabad that helps clients maximize product value while delivering rapid incremental innovation. The company has extensive SaaS M&A experience, with 20+ closed transactions on both the buy and sell sides, and its 100+ employee team is looking to grow.
● Work with, learn from, and contribute to a diverse, collaborative development team
● Use plenty of PHP, Go, JavaScript, MySQL, PostgreSQL, ElasticSearch, Redshift, AWS Services and other technologies
● Build efficient and reusable abstractions and systems
● Create robust cloud-based systems used by students globally at scale
● Experiment with cutting edge technologies and contribute to the company’s product roadmap
● Deliver data at scale to bring value to clients
Requirements
You will need:
● Experience working with a server side language in a full-stack environment
● Experience with various database technologies (relational, NoSQL, document-oriented, etc.) and query concepts in high-performance environments
● Experience in one of these areas: React, Backbone
● Understanding of ETL concepts and processes
● Great knowledge of design patterns and back-end architecture best practices
● Sound knowledge of front-end basics like JavaScript, HTML, CSS
● Experience with TDD, automated testing
● 12+ years’ experience as a developer
● Experience with Git or Mercurial
● Fluent written & spoken English
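The TDD requirement above boils down to writing the failing test before the implementation. A minimal sketch of one red/green cycle; the `slugify` helper and its test are invented for illustration:

```python
# Sketch of the TDD cycle: the test below was (notionally) written
# first, then the smallest implementation that makes it pass.
import unittest

def slugify(title: str) -> str:
    """Illustrative helper: turn a title into a URL slug."""
    return "-".join(title.lower().split())

class TestSlugify(unittest.TestCase):
    def test_lowercases_and_hyphenates(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

# Run the suite without exiting the interpreter.
unittest.main(argv=["slugify-tests"], exit=False)
```

The next cycle would add a failing test (e.g., for punctuation handling) and only then extend `slugify`.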
It would be great if you have:
● B.Sc or M.Sc degree in Software Engineering, Computer Science or similar
● Experience and/or interest in API Design
● Experience with Symfony and/or Doctrine
● Experience with Go and Microservices
● Experience with message queues e.g. SQS, Kafka, Kinesis, RabbitMQ
● Experience working with a modern Big Data stack
● Contributed to open source projects
● Experience working in an Agile environment
POST - SENIOR DATA ENGINEER WITH AWS
Experience: 5 years
Must-have:
• Highly skilled in Python and PySpark
• Expertise in writing AWS Glue ETL job scripts
• Experience in working with Kafka
• Extensive SQL DB experience – Postgres
Good-to-have:
• Experience in working with data analytics and modelling
• Hands-on experience with the Power BI visualization tool
• Knowledge of and hands-on experience with a version control system (Git)
Common:
• Excellent communication and presentation skills (written and verbal) at all levels of an organization
• Results-oriented, with the ability to prioritize and drive multiple initiatives to completion on time
• Proven ability to influence a diverse, geographically dispersed group of individuals and to facilitate and moderate productive design and implementation discussions driving towards results
Shifts: Flexible (might have to work as per US shift timings for meetings).
Employment Type - Any
Required Skills:
• Minimum of 4-6 years of experience in data modeling (including conceptual, logical and physical data models)
• 2-3 years of experience in Extraction, Transformation and Loading (ETL) work using data migration tools like Talend, Informatica, Datastage, etc.
• 4-6 years of experience as a database developer in Oracle, MS SQL or another enterprise database, with a focus on building data integration processes
• Candidate should have exposure to any NoSQL technology, preferably MongoDB
• Experience in processing large data volumes, indicated by experience with Big Data platforms (Teradata, Netezza, Vertica or Cloudera, Hortonworks, SAP HANA, Cassandra, etc.)
• Understanding of data warehousing concepts and decision support systems
• Ability to deal with sensitive and confidential material and adhere to worldwide data security policies
• Experience writing documentation for design and feature requirements
• Experience developing data-intensive applications on cloud-based architectures and infrastructures such as AWS, Azure, etc.
• Excellent communication and collaboration skills
Job description
Position: Data Engineer
Experience: 6+ years
Work Mode: Work from Office
Location: Bangalore
Please note: This position is focused on development rather than migration. Experience in Nifi or Tibco is mandatory.
Mandatory Skills: ETL, DevOps platform, Nifi or Tibco
We are seeking an experienced Data Engineer to join our team. As a Data Engineer, you will play a crucial role in developing and maintaining our data infrastructure and ensuring the smooth operation of our data platforms. The ideal candidate should have a strong background in advanced data engineering, scripting languages, cloud and big data technologies, ETL tools, and database structures.
Responsibilities:
• Utilize advanced data engineering techniques, including ETL (Extract, Transform, Load), SQL, and other advanced data manipulation techniques.
• Develop and maintain data-oriented scripting using languages such as Python.
• Create and manage data structures to ensure efficient and accurate data storage and retrieval.
• Work with cloud and big data technologies, specifically the AWS and Azure stacks, to process and analyze large volumes of data.
• Utilize ETL tools such as Nifi and Tibco to extract, transform, and load data into various systems.
• Have hands-on experience with database structures, particularly MSSQL and Vertica, to optimize data storage and retrieval.
• Manage and maintain the operations of data platforms, ensuring data availability, reliability, and security.
• Collaborate with cross-functional teams to understand data requirements and design appropriate data solutions.
• Stay up-to-date with the latest industry trends and advancements in data engineering and suggest improvements to enhance our data infrastructure.
Requirements:
• A minimum of 6 years of relevant experience as a Data Engineer.
• Proficiency in ETL, SQL, and other advanced data engineering techniques.
• Strong programming skills in scripting languages such as Python.
• Experience in creating and maintaining data structures for efficient data storage and retrieval.
• Familiarity with cloud and big data technologies, specifically the AWS and Azure stacks.
• Hands-on experience with ETL tools, particularly Nifi and Tibco.
• In-depth knowledge of database structures, including MSSQL and Vertica.
• Proven experience in managing and operating data platforms.
• Strong problem-solving and analytical skills with the ability to handle complex data challenges.
• Excellent communication and collaboration skills to work effectively in a team environment.
• Self-motivated with a strong drive for learning and keeping up-to-date with the latest industry trends.
Criteria:
- BE/MTech/MCA/MSc
- 3+ years' hands-on experience in T-SQL / PL/SQL / PL/pgSQL or NoSQL
- Immediate joiners preferred*
- Candidates will be selected based on logical/technical and scenario-based testing
Note: Candidates who have attended the interview process with TnS in the last 6 months will not be eligible.
Job Description:
- Technical Skills Desired:
- Experience in MS SQL Server and one of these relational DBs (PostgreSQL / AWS Aurora DB / MySQL / Oracle) or NoSQL DBs (MongoDB / DynamoDB / DocumentDB) in an application development environment, and eagerness to switch
- Design database tables, views, indexes
- Write functions and procedures for Middle Tier Development Team
- Work with any front-end developers in completing the database modules end to end (hands-on experience in parsing of JSON & XML in Stored Procedures would be an added advantage).
- Query Optimization for performance improvement
- Design & develop SSIS Packages or any other Transformation tools for ETL
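The table design, indexing, and query-optimization bullets above can be sketched with SQLite standing in for MS SQL Server or PostgreSQL; the `accounts` schema, data, and index name are all hypothetical:

```python
# Illustrative table + index design and a query-plan check, using
# SQLite as a stand-in for MS SQL Server / PostgreSQL.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE accounts (
        id      INTEGER PRIMARY KEY,
        branch  TEXT NOT NULL,
        balance REAL NOT NULL
    );
    -- Index supports the frequent branch lookup below.
    CREATE INDEX idx_accounts_branch ON accounts(branch);
""")
conn.executemany(
    "INSERT INTO accounts (branch, balance) VALUES (?, ?)",
    [("HYD", 100.0), ("BLR", 250.0), ("HYD", 75.0)],
)
# Query optimization: confirm the planner actually uses the index.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(balance) FROM accounts WHERE branch = ?",
    ("HYD",),
).fetchall()
print(plan[0][-1])  # plan detail names idx_accounts_branch

total = conn.execute(
    "SELECT SUM(balance) FROM accounts WHERE branch = 'HYD'"
).fetchone()[0]
print(total)  # 175.0
```

The same habit applies on the target engines: check the execution plan before and after adding an index rather than assuming it helps.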
- Functional Skills Desired:
- Banking / Insurance / Retail domain experience would be an added advantage
- Interaction with clients would be an added advantage
- Good to Have Skills:
- Knowledge in a Cloud Platform (AWS / Azure)
- Knowledge on version control system (SVN / Git)
- Exposure to Quality and Process Management
- Knowledge in Agile Methodology
- Soft skills (additional):
- Team building (attitude to train, work along, mentor juniors)
- Communication skills (all kinds)
- Quality consciousness
- Analytical acumen across all business requirements
- Think out-of-the-box for business solutions
Be a part of the IoT Product portfolio and execute towards Digital Transformation initiatives.
Prepare design documents in collaboration with product managers and engineering squads in development of use cases for new features.
Hands-on product lead developer expertise in designing solutions running on hybrid cloud environments.
Work as a Software Lead in application development using Java, JavaScript, Python, SQL and other latest technologies running on AWS environments.
Drive Engineering activities in Microservices and Cloud based Architecture by leveraging DevOps efficiencies and adopting new technology stack in AWS.
Drive communication and consistently report accurate product status to stakeholders.
Able to lead a team of engineers and help them with technical issues (80% self-work and 20% influencing scrum engineers).
Balance time on development projects including Technical Design, Code Reviews, Mentoring, and training. Able to break down requirements and build traceability in design and implementation details.
Work with developers to define unit & automated tests and closely monitor development milestones. Collaborate with the scrum team to identify functional, system and end-to-end integration of products leading to deployment.
Understand the end-to-end flow of product development and be able to prepare design documents and present them to the Engineering and Product Leadership team.
Full stack product development experience.
Skills Required:
Bachelor’s/Master’s degree or equivalent, with strong knowledge of methodology and tools.
8+ years' working experience in designing data, keyword driven or hybrid strategies; ability to troubleshoot and think out of the box.
Experience in CICD pipeline configuration, creation, and maintenance – from build to deploy to integration.
Experience in writing clear, concise and comprehensive design documents covering functional and non-functional requirements.
Hands-on experience in large enterprise development in a multi-cloud environment.
Strong expertise in Java, Python, and databases; experience in web frameworks like Django required for backend development.
Experience working in AWS (S3, Lambda, RDS, Security, ILM and AWS Services).
Experience with Docker and Kubernetes for container management and orchestration, setting up CI/CD pipelines using Jenkins / Ansible.
Experience with APIs (REST/SOAP). Experienced in Power BI, RDBMS, DB architecture design, and good control over SQL queries.
Experience with any NoSQL database, caching and messaging is a plus. Experience with messaging tools and caching frameworks.
Strong understanding of fundamental concepts: data structures, algorithms, OOP concepts, design patterns and architectures. Experience with Agile programming techniques such as test-driven development. Design applications to optimize for performance and usability.
The ideal candidate will be a BTech in Computer Science or an MCA, well-versed in full stack development of business applications using PHP, with MySQL as the database and HTML as the front end. Knowledge of other tech stacks is preferred, as is an understanding of the MS Azure cloud environment. Familiarity with Power BI will be useful.
Week off: Friday & Saturday
Day Shift.
Key responsibilities:
- Creating, designing and developing data models
- Prepare plans for all ETL (Extract/Transformation/Load) procedures and architectures
- Validating results and creating business reports
- Monitoring and tuning data loads and queries
- Develop and prepare a schedule for a new data warehouse
- Analyze large databases and recommend appropriate optimization for the same
- Administer all requirements and design various functional specifications for data
- Provide support to the Software Development Life cycle
- Prepare various code designs and ensure efficient implementation of the same
- Evaluate all codes and ensure the quality of all project deliverables
- Monitor data warehouse work and provide subject matter expertise
- Hands-on BI practices, data structures, data modeling, SQL skills
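The extract/transform/load steps listed above can be sketched end to end with the standard library; `csv` and `sqlite3` stand in for a real ETL tool and warehouse, and the feed and `fact_orders` table are invented for illustration:

```python
# Minimal ETL sketch: extract a raw feed, clean it, load a fact table.
# In production the source would be files/APIs, the transforms an ETL
# tool (DataStage, Informatica, Talend), and the target a warehouse.
import csv
import io
import sqlite3

RAW = "order_id,amount\n1, 10.5 \n2,4.0\n"                 # extract: raw feed

rows = list(csv.DictReader(io.StringIO(RAW)))               # parse records
clean = [(int(r["order_id"]), float(r["amount"].strip()))   # transform: cast,
         for r in rows]                                     # trim whitespace

conn = sqlite3.connect(":memory:")                          # load target
conn.execute("CREATE TABLE fact_orders (order_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO fact_orders VALUES (?, ?)", clean)

total = conn.execute("SELECT SUM(amount) FROM fact_orders").fetchone()[0]
print(total)  # 14.5
```

The validation step in the responsibilities above corresponds to checking totals like this against the source system before publishing reports.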
Hard Skills for a Data Warehouse Developer:
- Hands-on experience with ETL tools e.g., DataStage, Informatica, Pentaho, Talend
- Sound knowledge of SQL
- Experience with SQL databases such as Oracle, DB2, and SQL Server
- Experience using Data Warehouse platforms e.g., SAP, Birst
- Experience designing, developing, and implementing Data Warehouse solutions
- Project management and system development methodology
- Ability to proactively research solutions and best practices
Soft Skills for Data Warehouse Developers:
- Excellent Analytical skills
- Excellent verbal and written communications
- Strong organization skills
- Ability to work on a team, as well as independently
The main roles and responsibilities of a Power BI developer are discussed below:
- Power BI development and administration.
- Building Analysis Services reporting models.
- Developing visual reports, dashboards and KPI scorecards using Power BI desktop.
- Connecting to data sources, importing data and transforming data for Business Intelligence.
- Excellent analytical thinking for translating data into informative visuals and reports.
- Able to implement row-level security on data and have an understanding of application security layer models in Power BI.