
Minimum Qualifications:
5+ years of experience with Linux/Unix system administration and networking fundamentals
3+ years in a Software Engineering role or equivalent experience
4+ years of working with AWS
4+ years of experience working with Kubernetes and Docker
Strong skills in reading code as well as writing clean, maintainable, and scalable code
Good knowledge of Python
Experience designing, building, and maintaining scalable services and/or service-oriented architecture
Experience with high-availability systems
Experience with modern configuration management tools (e.g. Ansible/AWX, Chef, Puppet, Pulumi) and idempotency
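Idempotency, as mentioned above, means that applying the same desired state twice leaves the system unchanged. A minimal standard-library sketch of "ensure" semantics (the file path and sysctl-style line are illustrative, not from any real playbook):

```python
# Idempotency sketch: "ensure" semantics, as in configuration management tools.
# Applying the same desired state twice converges to a no-op.
import os
import tempfile

def ensure_line(path: str, line: str) -> bool:
    """Ensure `line` is present in the file at `path`.
    Returns True if a change was made, False if already converged."""
    existing = []
    if os.path.exists(path):
        with open(path) as f:
            existing = f.read().splitlines()
    if line in existing:
        return False  # already in desired state: do nothing
    with open(path, "a") as f:
        f.write(line + "\n")
    return True

# First apply changes state; second apply is a no-op.
tmp = tempfile.NamedTemporaryFile(delete=False, suffix=".conf")
tmp.close()
print(ensure_line(tmp.name, "net.ipv4.ip_forward = 1"))  # True (changed)
print(ensure_line(tmp.name, "net.ipv4.ip_forward = 1"))  # False (converged)
```

Real tools (Ansible, Chef, Puppet) generalize this check-then-apply pattern to packages, services, and cloud resources.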
Bonus Requirements:
Knowledge of standard security practices
Knowledge of the Hadoop ecosystem (e.g. Hadoop, Hive, Presto) including deployment, scaling, and maintenance
Experience with operating and maintaining VPN/SSH/ZeroTrust access infrastructure
Experience with CDNs such as CloudFront and Akamai
Good knowledge of JavaScript, Java, and Golang

About 6sense
6sense reinvents the way organizations create, manage, and convert pipeline to revenue. The 6sense Revenue AI platform captures anonymous buying signals, predicts the right accounts to target at the ideal time, and recommends the channels and messages to boost revenue performance. Removing guesswork, friction and wasted sales effort, 6sense empowers sales, marketing, and customer success teams to significantly improve pipeline quality, accelerate sales velocity, increase conversion rates, and grow revenue predictably. 6sense has been recognized for its market-defining technology by Forbes Cloud 100, G2, TrustRadius, Gartner, and Forrester, and for its strong culture by Glassdoor, Inc. Magazine, and Comparably.
Job Title: AI Engineer - NLP/LLM Data Product Engineer
Location: Chennai, TN - Hybrid
Duration: Full time
Job Summary:
About the Role:
We are growing our Data Science and Data Engineering team and are looking for an experienced AI Engineer specializing in creating GenAI LLM solutions. This position involves collaborating with clients and their teams, discovering gaps for automation using AI, designing customized AI solutions, and implementing technologies to streamline data entry processes within the healthcare sector.
Responsibilities:
· Conduct detailed consultations with clients' functional teams to understand their requirements; one use case involves handwritten medical records.
· Analyze existing data entry workflows and propose automation opportunities.
Design:
· Design tailored AI-driven solutions for the extraction and digitization of information from handwritten medical records.
· Collaborate with clients to define project scopes and objectives.
Technology Selection:
· Evaluate and recommend AI technologies, focusing on NLP, LLMs, and machine learning.
· Ensure seamless integration with existing systems and workflows.
Prototyping and Testing:
· Develop prototypes and proof-of-concept models to demonstrate the feasibility of proposed solutions.
· Conduct rigorous testing to ensure accuracy and reliability.
Implementation and Integration:
· Work closely with clients and IT teams to integrate AI solutions effectively.
· Provide technical support during the implementation phase.
Training and Documentation:
· Develop training materials for end-users and support staff.
· Create comprehensive documentation for implemented solutions.
Continuous Improvement:
· Monitor and optimize the performance of deployed solutions.
· Identify opportunities for further automation and improvement.
Qualifications:
· Advanced degree in Computer Science, Artificial Intelligence, or a related field (Master's or PhD required).
· Proven experience in developing and implementing AI solutions for data entry automation.
· Expertise in NLP, LLMs, and other machine-learning techniques.
· Strong programming skills, especially in Python.
· Familiarity with healthcare data privacy and regulatory requirements.
Additional Qualifications (great to have):
An ideal candidate will have expertise in the most current LLM/NLP models, particularly in the extraction of data from clinical reports, lab reports, and radiology reports. The ideal candidate should have a deep understanding of EMR/EHR applications and patient-related data.
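The extraction work described above can be illustrated with a deliberately simple, regex-based sketch. Real systems in this space would use NLP/LLM models, and the field names and label formats below are hypothetical, not from any real EMR schema:

```python
# Toy sketch of structured-field extraction from free-text clinical notes.
# Regex-based for illustration only; the labels ("Patient:", "DOB:", "Dx:")
# are assumptions, not a real report format.
import re

FIELD_PATTERNS = {
    "patient_name": re.compile(r"Patient:\s*(.+)"),
    "dob": re.compile(r"DOB:\s*([\d/.-]+)"),
    "diagnosis": re.compile(r"Dx:\s*(.+)"),
}

def extract_fields(note: str) -> dict:
    """Return whichever known fields appear in the note text."""
    result = {}
    for field, pattern in FIELD_PATTERNS.items():
        match = pattern.search(note)
        if match:
            result[field] = match.group(1).strip()
    return result

note = "Patient: Jane Doe\nDOB: 1980-02-14\nDx: Type 2 diabetes"
print(extract_fields(note))
# {'patient_name': 'Jane Doe', 'dob': '1980-02-14', 'diagnosis': 'Type 2 diabetes'}
```

Handwritten records add an OCR/handwriting-recognition step before extraction, which is where the LLM/NLP expertise the posting asks for comes in.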
Position: Lead Python Developer
Location: Ahmedabad, Gujarat
The Client company includes a team of experienced information services professionals who are passionate about growing and enhancing the value of information services businesses. They provide support with the talent, technology, tools, infrastructure, and expertise required to deliver across the data ecosystem.
Position Summary
We are seeking a skilled and experienced Backend Developer with strong expertise in TypeScript, Python, and web scraping. You will be responsible for designing, developing, and maintaining scalable backend services and APIs that power our data-driven products. Your role will involve collaborating with cross-functional teams, optimizing system performance, ensuring data integrity, and contributing to the design of efficient and secure architectures.
Job Responsibilities
● Design, develop, and maintain backend systems and services using Python and TypeScript.
● Develop and maintain web scraping solutions to extract, process, and manage large-scale data from multiple sources.
● Work with relational and non-relational databases, ensuring high availability, scalability, and performance.
● Implement authentication, authorization, and security best practices across services.
● Write clean, maintainable, and testable code following best practices and coding standards.
● Collaborate with frontend engineers, data engineers, and DevOps teams to deliver robust solutions and troubleshoot, debug, and upgrade existing applications.
● Stay updated with backend development trends, tools, and frameworks to continuously improve processes.
● Utilize core crawling experience to design efficient strategies for scraping the data from different websites and applications.
● Collaborate with technology and data collection teams to build end-to-end technology-enabled ecosystems, and partner on research projects to analyze massive data inputs.
● Design and develop web crawlers, independently solving problems encountered during development.
● Stay updated with the latest web scraping techniques, tools, and industry trends to continuously improve the scraping processes.
Job Requirements
● 4+ years of professional experience in backend development with TypeScript and Python.
● Strong understanding of TypeScript-based server-side frameworks (e.g., Node.js, NestJS, Express) and Python frameworks (e.g., FastAPI, Django, Flask).
● Experience with tools and libraries for web scraping (e.g., Scrapy, BeautifulSoup, Selenium, Puppeteer)
● Hands-on experience with Temporal for creating and orchestrating workflows
● Proven hands-on experience in web scraping, including crawling, data extraction, deduplication, and handling dynamic websites.
● Proficient in implementing proxy solutions and handling bot-detection challenges (e.g., Cloudflare).
● Experience working with Docker, containerized deployments, and cloud environments (GCP or Azure).
● Proficiency with database systems such as MongoDB and Elastic Search.
● Hands-on experience with designing and maintaining scalable APIs.
● Knowledge of software testing practices (unit, integration, end-to-end).
● Familiarity with CI/CD pipelines and version control systems (Git).
● Strong problem-solving skills, attention to detail, and ability to work in agile environments.
● Great communication skills and ability to navigate in undirected situations.
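The extraction-and-deduplication flow in the requirements above can be sketched with only the standard library (real pipelines would use Scrapy or BeautifulSoup, as listed):

```python
# Minimal link extraction and deduplication using only stdlib html.parser.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href values from anchor tags in document order."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html: str) -> list:
    parser = LinkExtractor()
    parser.feed(html)
    # Deduplicate while preserving first-seen order.
    return list(dict.fromkeys(parser.links))

page = '<a href="/a">x</a><a href="/b">y</a><a href="/a">dup</a>'
print(extract_links(page))  # ['/a', '/b']
```

Dynamic (JavaScript-rendered) sites and bot detection, also called out above, are what push real crawlers toward Selenium/Puppeteer and proxy rotation.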
Job Exposure:
● Opportunity to apply creative methods in acquiring and filtering North American government and agency data from various websites and sources
● In-depth industry exposure to data-harvesting techniques to build and scale a robust, sustainable model using open-source applications
● Effective collaboration with the IT team to design tailor-made solutions based on clients' requirements
● Unique opportunity to research various agencies, vendors, products, and technology tools to compose a solution
Responsibilities:
- Lead simultaneous development for multiple business verticals.
- Design & develop highly scalable, reliable, secure, and fault-tolerant systems.
- Ensure that exceptional standards are maintained in all aspects of engineering.
- Collaborate with other engineering teams to learn and share best practices.
- Take ownership of technical performance metrics and strive actively to improve them.
- Mentor junior members of the team and contribute to code reviews.
Requirements:
- A passion for solving tough engineering/data challenges.
- Well versed with cloud computing platforms (AWS/GCP)
- Experience with SQL technologies (MySQL, PostgreSQL)
- Experience working with NoSQL technologies (MongoDB, Elasticsearch)
- Excellent programming skills in Python/Java/GoLang
- Big Data streaming services (Kinesis, Kafka, RabbitMQ)
- Distributed cache systems (Redis, Memcache)
- Advanced data solutions (BigQuery, Redshift, DynamoDB, Cassandra)
- Automated testing frameworks and CI/CD pipelines
- Infrastructure orchestration (Docker/Kubernetes/Nginx)
- Cloud-native tech like Lambda, ASG, CDN, ELB, SNS/SQS, S3, Route53, SES
5 years in software development (Minimum 3 years)
Strong expertise in Ruby on Rails (3-5 years)
Knowledge of Python is a plus
Key Skills:
Proficiency in scalable app techniques: caching, APM, microservices architecture
Ability to write high-quality code independently
Experience mentoring junior engineers (0-2 years of experience)
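The caching technique listed in the skills above can be illustrated with a toy in-process TTL cache; production systems would typically reach for Redis or Memcached instead:

```python
# A toy in-process TTL (time-to-live) cache illustrating the caching
# pattern used in scalable apps; not a substitute for Redis/Memcached.
import time

class TTLCache:
    def __init__(self, ttl_seconds: float):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expiry timestamp)

    def set(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        value, expiry = entry
        if time.monotonic() > expiry:
            del self._store[key]  # lazily evict expired entries
            return default
        return value

cache = TTLCache(ttl_seconds=0.05)
cache.set("user:42", {"name": "Ada"})
print(cache.get("user:42"))  # {'name': 'Ada'}
time.sleep(0.1)
print(cache.get("user:42"))  # None (expired)
```

The lazy-eviction design keeps `get` O(1); a real cache would also bound memory with an eviction policy such as LRU.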
What We Offer:
An opportunity to work with a dynamic team
A challenging environment where your skills will be put to the test
A chance to make a real impact by guiding and mentoring others
Ready to make your mark? If you're based out of or willing to relocate to Gurgaon and have the experience we're looking for, apply now!
About Us:
Developed in formal collaboration with the University of Cambridge in May 2000, HeyMath! is an Ed-Tech company whose mission is to Raise the Game in Maths for school systems around the world. We do this using technology to deliver engaging teaching methodologies and personalised learning paths for students. HeyMath! has been successfully adopted by CBSE schools since 2004, with positive outcomes for the entire ecosystem.
Check us out at www.heymath.com
We plan to work mainly from home in 2022 and the virtual office atmosphere is collegiate, informal and friendly, with small high-impact teams making a difference to customers.
What we are looking for:
Experience in building and re-engineering cloud-based solutions on AWS.
Strong knowledge of Object-Oriented Programming (OOP) and design patterns is a must.
Hands-on development on the Spring MVC framework.
Experience working on Java 8 or above.
Very good knowledge of RDBMSs such as MySQL, including performance tuning, is a must.
Exposure to server-side and client-side caching mechanisms.
Ability to debug applications and quickly provide workable solutions.
Experience working with Elasticsearch, Kafka, or Kubernetes (or all three) is a nice-to-have.

Programming Languages: Perl, Java. Perl programming with strong OOP knowledge.
UI: HTML, JS
System: Linux (must have) – good knowledge and shell-scripting experience.
Prior experience in infrastructure automation and monitoring will definitely help.
Description:
The person in this role:
- Will be involved in developing new monitoring scripts, plus enhancements and defect fixes for existing monitors
- Will be on call to support any incoming production/P1 internal issues that need urgent attention (team members are on call for a week, with a weekly rotation within the team)
Preferred skills:
- Perl
- Shell scripting
- Unix
- Jenkins
