

About Docon Technologies
Using our type-free, self-learning iPad app, doctors capture all the important elements of a patient visit in under two minutes!

Key Responsibilities
Develop and deploy scalable AI-powered applications using the Python stack.
Leverage cutting-edge AI/ML technologies such as LangChain, LangGraph, AutoGen, Phidata, CrewAI, Hugging Face, OpenAI APIs, PyTorch, TensorFlow, and other advanced frameworks to build innovative AI applications.
Write clean, efficient, and well-documented code adhering to best practices.
Build and manage robust APIs for integrating AI solutions into applications (see the sketch after this list).
Research and experiment with emerging technologies to discover new AI-driven use cases.
Deploy and manage AI solutions in cloud environments (AWS, Azure, GCP), ensuring security, scalability, and performance.
Collaborate with product managers, engineers, and UX/UI designers to define AI application requirements and align them with business objectives.
Apply MLOps principles to streamline AI model deployment, monitoring, and optimization.
Solve complex problems using foundational knowledge of generative AI, machine learning, and data processing techniques.
Contribute to continuous improvement of development processes and practices.
Resolve production issues by conducting effective troubleshooting and root cause analysis (RCA) within SLAs. Work with operations teams to support product deployment and issue resolution.
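To make the API-building and model-serving responsibilities above concrete, here is a minimal, hedged sketch of the kind of service this role describes: a FastAPI endpoint wrapping an OpenAI chat-completion call. The endpoint path, request schema, model name, and prompt are illustrative assumptions, not details from this posting.

```python
# Minimal sketch, assuming the fastapi, pydantic, and openai (v1+) packages
# are installed and OPENAI_API_KEY is set in the environment.
from fastapi import FastAPI
from pydantic import BaseModel
from openai import OpenAI

app = FastAPI()
client = OpenAI()  # reads OPENAI_API_KEY from the environment


class SummarizeRequest(BaseModel):
    text: str


@app.post("/summarize")  # hypothetical endpoint, not from the posting
def summarize(req: SummarizeRequest) -> dict:
    # Keep the endpoint thin: delegate the language work to the hosted model.
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[{"role": "user", "content": f"Summarize briefly: {req.text}"}],
    )
    return {"summary": resp.choices[0].message.content}
```

Run locally with `uvicorn main:app --reload` (assuming the file is named main.py) and POST JSON like {"text": "..."} to /summarize.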
Requirements
Educational Background
Bachelor’s or Master’s degree in Computer Science or a related field with a strong academic track record.
Experience & Technical Skills:
3+ years of hands-on experience in building and deploying AI applications on the Python stack. Strong knowledge of Python and related frameworks.
Good knowledge of some of the AI/ML and agentic frameworks and platforms, such as LangChain, LangGraph, AutoGen, Hugging Face, CrewAI, OpenAI APIs, PyTorch, and TensorFlow.
Experience with AI/ML workflows, including data preparation, model deployment, and optimization.
Proficiency in building and consuming RESTful APIs for connecting AI models with web applications.
Knowledge of MLOps tools and practices, including model lifecycle management, CI/CD for AI, and model monitoring.
Familiarity with cloud platforms like AWS, Azure, or Google Cloud, including containerization (Docker) and orchestration (Kubernetes).
Experience with CI/CD pipelines, version control (Git), and automation frameworks.
Strong understanding of algorithms, AI/ML fundamentals, and data preprocessing techniques.
Soft Skills:
Passion for exploring, experimenting with, and implementing emerging AI technologies.
Self-starter who can work independently and collaboratively in a fast-paced environment.
Excellent problem-solving and analytical abilities to tackle complex challenges.
Effective communication skills to explain AI concepts and strategies to stakeholders.

**Training and Internship Program for Complete Freshers at Our IT Company, Logical Soft Tech Pvt Ltd, Indore (M.P.)**
Candidates in Indore can visit our office; candidates from outside Indore can also attend online.
This is a great opportunity for complete IT freshers who want to learn live-project work inside an IT company rather than at a training institute. The program comes with 100% guaranteed job placement assistance and recommendations, personality development classes, and interaction with our developer team.
Duration: 1 to 9 months (depends on the candidate and course)
We offer the following tech-stack courses for training and internship:
1) Mobile app development (Android Java / Kotlin / iOS Swift / Flutter Dart)
2) Full-stack web development: frontend (React.js, HTML, CSS, JS, BS, Ajax) and backend (PHP CodeIgniter MVC / Node.js / Java Spring Boot / JSP / Servlet)
3) Frontend: React.js, HTML, CSS, JS, BS, Ajax, jQuery
4) Backend: PHP CodeIgniter MVC / Node.js / Java Spring Boot / JSP / Servlet
5) SEO / SMM / digital marketing
6) Manual testing
7) Game development
8) Blockchain development (smart contracts, blockchain, cryptocurrency)
For further inquiries, detailed discussion, and the registration process, you may reach our HR numbers or walk in directly to our office between 11 AM and 7 PM, Monday to Saturday:
Company Name : Logical Soft Tech Pvt Ltd
Contact: +91-78.69.73.15.95 (HR), +91-74.15.95.09.19 (HR), +91-82.69.82.97.29 (HR), +91-82.10.25.18.24 (Technical Department)
Email: talentlogicalsofttech@gmail.com, logicalhr.softtech@gmail.com, hrlogicalsofttech@gmail.com
Address: 2nd floor, 388, PU4, Scheme 54 PU4, next to Krozzon Hotel, in front of Old Eye Retina Hospital, Vijay Nagar, Indore, M.P.
You can also fill out the Google Form below, and our HR team will get back to you:
https://forms.gle/6HYUGMp3A8WdvmDS9
Hurry up, as seats are limited! (https://www.instagram.com/p/C-evFytohcz/)

About the Company:
Gruve is an innovative Software Services startup dedicated to empowering Enterprise Customers in managing their Data Life Cycle. We specialize in Cyber Security, Customer Experience, Infrastructure, and advanced technologies such as Machine Learning and Artificial Intelligence. Our mission is to assist our customers in their business strategies by utilizing their data to make more intelligent decisions. As a well-funded early-stage startup, Gruve offers a dynamic environment with strong customer and partner networks.
Why Gruve:
At Gruve, we foster a culture of innovation, collaboration, and continuous learning. We are committed to building a diverse and inclusive workplace where everyone can thrive and contribute their best work. If you’re passionate about technology and eager to make an impact, we’d love to hear from you.
Gruve is an equal opportunity employer. We welcome applicants from all backgrounds and thank all who apply; however, only those selected for an interview will be contacted.
Position summary:
We are seeking a Senior Software Development Engineer – Data Engineering with 5-8 years of experience to design, develop, and optimize data pipelines and analytics workflows using Snowflake, Databricks, and Apache Spark. The ideal candidate will have a strong background in big data processing, cloud data platforms, and performance optimization to enable scalable data-driven solutions.
Key Roles & Responsibilities:
- Design, develop, and optimize ETL/ELT pipelines using Apache Spark, PySpark, Databricks, and Snowflake (a minimal PySpark sketch follows this list).
- Implement real-time and batch data processing workflows in cloud environments (AWS, Azure, GCP).
- Develop high-performance, scalable data pipelines for structured, semi-structured, and unstructured data.
- Work with Delta Lake and Lakehouse architectures to improve data reliability and efficiency.
- Optimize Snowflake and Databricks performance, including query tuning, caching, partitioning, and cost optimization.
- Implement data governance, security, and compliance best practices.
- Build and maintain data models, transformations, and data marts for analytics and reporting.
- Collaborate with data scientists, analysts, and business teams to define data engineering requirements.
- Automate infrastructure and deployments using Terraform, Airflow, or dbt.
- Monitor and troubleshoot data pipeline failures, performance issues, and bottlenecks.
- Develop and enforce data quality and observability frameworks using Great Expectations, Monte Carlo, or similar tools.
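As an illustration of the first responsibility above, here is a minimal batch ETL sketch in PySpark. The bucket paths, column names, and aggregation are hypothetical placeholders, and writing Delta format assumes the Delta Lake libraries are available on the cluster (as they are on Databricks).

```python
# Minimal PySpark ETL sketch: read raw Parquet, aggregate, write a Delta table.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-etl").getOrCreate()

# Extract: raw structured data (Parquet is one of the formats listed below).
orders = spark.read.parquet("s3://example-bucket/raw/orders/")  # hypothetical path

# Transform: filter completed orders and compute daily revenue.
daily_revenue = (
    orders.filter(F.col("status") == "COMPLETED")
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
)

# Load: write to a Delta table for downstream analytics (Lakehouse pattern).
daily_revenue.write.format("delta").mode("overwrite").save(
    "s3://example-bucket/curated/daily_revenue/"  # hypothetical path
)
```

The same pipeline could land in Snowflake instead, for example via the Spark-Snowflake connector, with partitioning and caching tuned as the responsibilities above describe.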
Basic Qualifications:
- Bachelor’s or Master’s Degree in Computer Science or Data Science.
- 5–8 years of experience in data engineering, big data processing, and cloud-based data platforms.
- Hands-on expertise in Apache Spark, PySpark, and distributed computing frameworks.
- Strong experience with Snowflake (Warehouses, Streams, Tasks, Snowpipe, Query Optimization).
- Experience in Databricks (Delta Lake, MLflow, SQL Analytics, Photon Engine).
- Proficiency in SQL, Python, or Scala for data transformation and analytics.
- Experience working with data lake architectures and storage formats (Parquet, Avro, ORC, Iceberg).
- Hands-on experience with cloud data services (AWS Redshift, Azure Synapse, Google BigQuery).
- Experience in workflow orchestration tools like Apache Airflow, Prefect, or Dagster.
- Strong understanding of data governance, access control, and encryption strategies.
- Experience with CI/CD for data pipelines using GitOps, Terraform, dbt, or similar technologies.
Preferred Qualifications:
- Knowledge of streaming data processing (Apache Kafka, Flink, Kinesis, Pub/Sub).
- Experience in BI and analytics tools (Tableau, Power BI, Looker).
- Familiarity with data observability tools (Monte Carlo, Great Expectations).
- Experience with machine learning feature engineering pipelines in Databricks.
- Contributions to open-source data engineering projects.
A Subject Matter Expert, also referred to as an SME, is a person who has special skills or knowledge of a particular job or topic. An SME is considered an expert on a certain topic: not only educated on the subject, but also able to share their knowledge with other team members.
Qualification for SME Insurance:
- 3-5 years of experience in the insurance industry
- Experience in insurance products (all products, including life, health & general), insurance processes, and the insurance industry chain
- Bachelor’s degree (or equivalent)
- English: Fluent
Roles and responsibilities
- Strong Process & product knowledge of the Insurance products
- Keep up to date with industry developments; offer solutions and suggestions for process and product improvement to management.
- Responsible for facilitating OJT for the new joiners on all Insurance related products and processes.
- Help in designing a best-in-class customer support and engagement journey for insurance products.
- Liaise with different stakeholders/partners for quicker resolution of customer inquiries and requests.
- Ability to take ownership and work for early resolution and closure of customer issues.
- Ability to accept change and make the best of any situation.
- Proactively highlight and escalate issues, concerns, areas of improvement, and opportunities to the leadership team to get their help as and when needed
- Handle and monitor all complaints received through all mediums, tracking them within defined TATs
- Ensure that complaint-closure quality (RCA) is tracked closely by first understanding all the issues highlighted by the customer; maintain records of all RCAs and the process improvements undertaken for relevant complaints
- Keeping up to date with all the regulatory requirements and changes in the Insurance industry
2+ years' experience as a developer with a proven track record of on-time, successful deliveries.
- Must be highly proficient in Node.js, JavaScript, and MySQL, with microservices (REST web services)
- Work end to end on microservices and SPAs built on cutting-edge technologies like Node.js (strong practical experience preferred)
- Strong knowledge of database & web applications.
- Preferred: grunt, wagner, npm, passport, redix, chai
- Excellent problem-solving and communication skills, so that you can articulate technical concepts to the team, along with excellent development and debugging skills
- Ability to learn, act, and thrive in a fast-paced environment with a distributed team
Duties & Responsibilities :
- Take requirements through design and build reusable modules.
- Build, scale, and performance-tune services.
- Advanced understanding of front-end technologies such as HTML5 and CSS3.
- Advanced understanding of AWS, npm (the Node.js package manager), Git, and unit testing.
- Strong work experience in Design Patterns and Algorithms.
- Experience with debugging, troubleshooting and problem-solving issues.
- Must have strong communication skills.
Datametica is looking for talented BigQuery engineers.
Total Experience - 2+ yrs.
Notice Period – 0 - 30 days
Work Location – Pune, Hyderabad
Job Description:
- Sound understanding of Google Cloud Platform; should have worked on BigQuery, Workflow, or Composer (see the BigQuery sketch after this list)
- Experience in migration to GCP and integration projects in large-scale environments; ETL technical design, development, and support
- Good SQL and Unix scripting skills; programming experience with Python, Java, or Spark would be desirable
- Experience in SOA and services-based data solutions would be advantageous
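As a small illustration of the BigQuery work described above, here is a hedged Python sketch using the google-cloud-bigquery client. The project, dataset, table, and column names are hypothetical, and it assumes application-default GCP credentials are configured.

```python
# Minimal BigQuery sketch, assuming the google-cloud-bigquery package is
# installed and `gcloud auth application-default login` has been run.
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

query = """
    SELECT event_date, COUNT(*) AS events
    FROM `example-project.analytics.events`  -- hypothetical table
    WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY event_date
    ORDER BY event_date
"""

# client.query() submits the job; .result() blocks until it completes.
for row in client.query(query).result():
    print(row.event_date, row.events)
```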
About the Company:
www.datametica.com
Datametica is among the world's leading Cloud and Big Data analytics companies.
Datametica was founded in 2013 and has grown at an accelerated pace within a short span of 8 years. We are providing a broad and capable set of services that encompass a vision of success, driven by innovation and value addition that helps organizations in making strategic decisions influencing business growth.
Datametica is the global leader in migrating Legacy Data Warehouses to the Cloud. Datametica moves Data Warehouses to Cloud faster, at a lower cost, and with few errors, even running in parallel with full data validation for months.
Datametica's specialized team of Data Scientists has implemented award-winning analytical models for use cases involving both unstructured and structured data.
Datametica has earned the highest level of partnership with Google, AWS, and Microsoft, which enables Datametica to deliver successful projects for clients across industry verticals at a global level, with teams deployed in the USA, EU, and APAC.
Recognition:
We are gratified to be recognized as a Top 10 Big Data Global Company by CIO story.
If this excites you, please apply.
Role: US IT Recruiter - Freshers/Trainees
Experience: 0-6 months
Location: Hyderabad (Onsite)
Technovert is not a typical IT services firm. To our credit, we have two successful products generating $2M+ in licensing/SaaS revenues, which is rare in the industry.
We are obsessed with our love for technology and the infinite possibilities it can create for making this world a better place. Our clients find us at our best when we are challenged with their toughest problems, and we love chasing them. It thrills us and motivates us to deliver more. Our global delivery model has earned us the trust and reputation of being a partner of choice.
We have a strong heritage built on great people who put customers first and deliver exceptional results with no surprises, every time. We partner with you to understand the interconnection of user experience, business goals, and information technology. It's the optimal fusing of these three drivers that delivers exceptional results.
Responsibilities:
- Should understand the requirement in depth to ensure quality sourcing and recruiting
- Should submit profiles with a quick turnaround time to meet the sharp deadlines of each requisition
- Make use of effective sourcing strategies, such as headhunting, internet sourcing, networking, and employee referrals
Must have:
- Candidates must have good communication skills
- Must have a good attitude
- Proven ability to consistently and positively contribute in a fast-paced, changing work environment
- Excellent communication skills, both written and verbal
Qualification:
- 0-6 months of experience in US Staffing.
- Should be ready to work 80% of the time on W2 recruitment only.
- Good communication skills.
- Willing to work from office (Hyderabad)
- Experience in using job portals like Career Builder, Dice and LinkedIn would be a plus.
- Professional cold calling to convert suspect leads into prospect-quality leads.
- Schedule business development meetings with prospective clients; email company profiles to prospective clients.
- Call on leads.
- Promote company service features with a precise presentation and identify client requirements in the meeting.
- Share email quotations and the rate bifurcation.
- Ensuring adherence to SGB levies and Guidelines.
- Rapport building, convincing and negotiating, and strong follow-up to confirm the business with the client.
- Update new client details in the system and share them with the team.

POSITION SUMMARY
Essential Skills & Key Responsibilities
- Expertise in the P2P domain, which includes Procurement and Accounts Payable
- Fusion P2P functional implementation experience, including interfaces & RICE objects integrating with other legacy applications
- Leading the business analysis capability in the procurement space, building excellent business connect, influencing and negotiating between tech and business.
- Capture and clarification of business requirements through a range of analysis techniques for the business (data flow diagrams, CASE tool analysis, gap analysis, and work & process flow diagrams)
- Function as a techno-functional resource and assist in designing solutions for business
- Understand the business data in depth, analyse key performance indicators and use the same to analyse the requirements further
- Obtaining agreement on business analysis deliverables and ensuring that they meet all the requirements of the business
- Proactively working with business to identify, define, and clarify the scope / issues in terms of complex business/systems requirements.
- Identify options for potential solutions and assessing them for both technical and business suitability/feasibility
- Working with business to capture & clarify business requirements & support business & technology teams in data analysis, in UAT phase etc.
- Acting as a proxy customer with the development teams, facilitating open communication between the customer & development team
- Driving user acceptance criteria with the customer
- Aware of Agile terminology and knowledgeable about Agile tools like JIRA, Confluence, etc.
- Basic knowledge of PL/SQL for data querying & analysis will be an added advantage.
- Exposure to MS Office products like Visio and Excel

