
Artificial Intelligence Researcher (Computer Vision)
Responsibilities
• Work on various SOTA computer vision models and on dataset augmentation and dataset generation techniques that help improve model accuracy and precision.
• Work on the development and improvement of end-to-end pipeline use cases running at scale.
• Apply programming skills in multi-threaded GPU/CUDA computing and API solutions.
• Train detection, classification, and segmentation models with TensorFlow, PyTorch, MXNet, etc. (a minimal training sketch follows below).
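A minimal sketch of the kind of model training this role involves, using PyTorch and torchvision; the dataset path, class layout, and hyperparameters below are illustrative placeholders, not details from the posting.

```python
# Hedged sketch: fine-tune a torchvision classifier on an ImageFolder dataset.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.RandomHorizontalFlip(),   # simple dataset augmentation
    transforms.ToTensor(),
])

train_ds = datasets.ImageFolder("data/train", transform=transform)  # hypothetical path
loader = DataLoader(train_ds, batch_size=32, shuffle=True, num_workers=4)

model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(train_ds.classes))   # adapt the head

device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

for epoch in range(5):
    for images, labels in loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```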
Required Skills
• Strong development skills required in Python and C++.
• Ability to architect a solution based on given requirements and convert the business requirements into a technical computer vision problem statement.
• Ability to work in a fast-paced environment and coordinate across different parts of different projects.
• Bring technical expertise to the implementation of coding standards and best practices across the team.
• Extensive experience working on edge devices such as Jetson Nano, Raspberry Pi, and other GPU-powered, low-compute devices.
• Experience using Docker, NVIDIA Docker, and NVIDIA NGC containers for computer vision deep learning.
• Experience with scalable cloud deployment architectures for video analytics (involving Kubernetes and/or Kafka).
• Good experience with at least one cloud platform such as AWS, Azure, or Google Cloud.
• Experience with model optimisation for NVIDIA hardware (TensorRT conversion of both TensorFlow and PyTorch models; see the sketch after this list).
• Proficient understanding of code versioning tools, such as Git.
• Proficient in Data Structures & Algorithms.
• Well versed in software design paradigms and good development practices.
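One common route to the NVIDIA-optimised deployment mentioned above is exporting the trained model to ONNX and building a TensorRT engine from it; the file names, input shape, and flags below are illustrative assumptions, not a prescribed workflow.

```python
# Hedged sketch: PyTorch -> ONNX, then TensorRT via the trtexec CLI.
import torch
from torchvision import models

model = models.resnet50(weights=None).eval()
dummy = torch.randn(1, 3, 224, 224)  # assumed input shape

torch.onnx.export(
    model, dummy, "model.onnx",
    input_names=["input"], output_names=["output"],
    dynamic_axes={"input": {0: "batch"}, "output": {0: "batch"}},
)

# On a machine with TensorRT installed (e.g. a Jetson or an NGC container):
#   trtexec --onnx=model.onnx --saveEngine=model_fp16.engine --fp16
```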

About Daten Wissen Pvt Ltd
Similar jobs
Job Title: Senior Python Developer – Product Engineering
Location: Pune, India
Experience Required: 3 to 7 Years
Employment Type: Full-time
Employment Agreement: Minimum 3 years (on completion of 3 years, a one-time commitment bonus will apply based on performance)
🏢 About Our Client
Our client is a leading enterprise cybersecurity company offering an integrated platform for Digital Rights Management (DRM), Enterprise File Sync and Share (EFSS), and Content-Aware Data Protection (CDP). With patented technologies for secure file sharing, endpoint encryption, and real-time policy enforcement, the platform helps organizations maintain control over sensitive data even after it leaves the enterprise perimeter.
🎯 Role Overview
We are looking for a skilled Python Developer with a strong product mindset and experience building scalable, secure, and performance-critical systems. You will join our core engineering team to enhance backend services powering DRM enforcement, file tracking, audit logging, and file sync engines.
This is a hands-on role for someone who thrives in a product-first, security-driven environment and wants to build technologies that handle terabytes of enterprise data across thousands of endpoints.
🛠️ Key Responsibilities
● Develop and enhance server-side services for DRM policy enforcement, file synchronization, data leak protection, and endpoint telemetry.
● Build Python-based backend APIs and services that interact with file systems, agent software, and enterprise infrastructure.
● Work on delta sync, file versioning, audit trails, and secure content preview/rendering services (a minimal delta-sync sketch follows this list).
● Implement secure file handling, encryption workflows, and token-based access controls across modules.
● Collaborate with DevOps to optimize scalability, performance, and availability of core services across hybrid deployments (on-prem/cloud).
● Debug and maintain production-level services; drive incident resolution and performance optimization.
● Integrate with 3rd-party platforms such as LDAP, AD, DLP, CASB, and SIEM systems.
● Participate in code reviews, architecture planning, and mentoring junior developers.
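A minimal sketch of the block-level delta-sync idea referenced above: hash fixed-size chunks and re-upload only the chunks whose digests changed. Production engines (rsync-style delta encoding) use rolling checksums; the chunk size and file paths here are illustrative assumptions.

```python
# Hedged sketch: naive fixed-size-chunk delta detection between two file versions.
import hashlib

CHUNK = 64 * 1024  # 64 KiB blocks (assumption, not from the posting)

def block_digests(path: str) -> list[str]:
    digests = []
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK):
            digests.append(hashlib.sha256(chunk).hexdigest())
    return digests

def changed_blocks(old_path: str, new_path: str) -> list[int]:
    old, new = block_digests(old_path), block_digests(new_path)
    # indices of blocks that must be re-uploaded for the new version
    return [i for i, digest in enumerate(new) if i >= len(old) or old[i] != digest]
```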
📌 Required Skills & Experience
● 3+ years of professional experience with Python 3.x, preferably in enterprise or security domains.
● Strong understanding of multithreading, file I/O, inter-process communication, and low-level system APIs.
● Expertise in building RESTful APIs, schedulers, workers (Celery), and microservices.
● Solid experience with encryption libraries (OpenSSL, cryptography.io) and secure coding practices (see the sketch after this list).
● Hands-on experience with PostgreSQL, Redis, SQLite, or other transactional and cache stores.
● Familiarity with Linux internals, filesystem hooks, journaling/logging systems, and OS-level operations.
● Experience with source control (Git), containerization (Docker/K8s), and CI/CD.
● Proven ability to write clean, modular, testable, and scalable code for production environments.
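A minimal sketch, using cryptography.io's Fernet, of the kind of encrypted file handling and short-lived token checks described above; the key handling and token format are simplified assumptions, not the client's actual DRM design.

```python
# Hedged sketch: symmetric file encryption plus a time-limited access token.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in production this would come from a KMS/vault
fernet = Fernet(key)

def encrypt_file(src: str, dst: str) -> None:
    with open(src, "rb") as f:
        ciphertext = fernet.encrypt(f.read())
    with open(dst, "wb") as f:
        f.write(ciphertext)

def issue_access_token(user_id: str) -> bytes:
    # Fernet tokens embed a timestamp, so expiry can be enforced on decrypt.
    return fernet.encrypt(f"user:{user_id}".encode())

def verify_access_token(token: bytes, ttl_seconds: int = 300) -> str:
    # Raises InvalidToken if the token is expired or has been tampered with.
    payload = fernet.decrypt(token, ttl=ttl_seconds)
    return payload.decode().removeprefix("user:")
```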
➕ Preferred/Bonus Skills
● Experience in EFSS, DRM, endpoint DLP, or enterprise content security platforms.
● Knowledge of file diffing algorithms (rsync, delta encoding) or document watermarking.
● Prior experience with agent-based software (Windows/Linux), desktop sync tools, or version control systems.
● Exposure to compliance frameworks (e.g., DPDP Act, GDPR, RBI-CSF) is a plus.
🌟 What We Offer
● Work on a patented and mission-critical enterprise cybersecurity platform
● Join a fast-paced team focused on innovation, security, and customer success
● Hybrid work flexibility with competitive compensation and growth opportunities
● Direct impact on product roadmap, architecture, and IP development
Job Role
· Position Title: Expert Java Engineer
· Experience Range: 9 to 12 yrs
· Location: Pune
· Notice Period: Immediate Joiner
Must-have Requirements
● 9+ years of experience working as a software developer.
● Strong proficiency in Java and Spring Boot.
● Strong experience in building applications that interact with relational databases using SQL.
● Some experience of Enterprise Java (J2EE / JavaEE / Spring) application architectures.
● History of delivering high-cadence modern applications with applied Agile methodologies, test-first development approaches, adopting CI/CD pipelines and using Git version control.
• Preparing and delivering technical presentations for clients in order to explain the products and potential deliverables.
• Understand customer requirements and convey them to management.
• Help in product modification based on customer needs.
• Maintain relationships with existing customers: take feedback, resolve problems related to installing equipment, renew orders, etc.
• Prepare regular reports.
• Recording and maintaining client data.
• Understanding client quotations and negotiating with
Job Description
Overview:
We are seeking an experienced Azure Data Engineer to join our team in a hybrid Developer/Support capacity. This role focuses on enhancing and supporting existing Data & Analytics solutions by leveraging Azure Data Engineering technologies. The engineer will work on developing, maintaining, and deploying IT products and solutions that serve various business users, with a strong emphasis on performance, scalability, and reliability.
Must-Have Skills:
Azure Databricks
PySpark
Azure Synapse Analytics
Key Responsibilities:
- Incident classification and prioritization
- Log analysis and trend identification
- Coordination with Subject Matter Experts (SMEs)
- Escalation of unresolved or complex issues
- Root cause analysis and permanent resolution implementation
- Stakeholder communication and status updates
- Resolution of complex and major incidents
- Code reviews (two per individual per week) to ensure adherence to standards and optimize performance
- Bug fixing of recurring or critical issues identified during operations
- Gold layer tasks, including enhancements and performance tuning.
- Design, develop, and support data pipelines and solutions using Azure data engineering services.
- Implement data flow and ETL techniques leveraging Azure Data Factory, Databricks, and Synapse.
- Cleanse, transform, and enrich datasets using Databricks notebooks and PySpark (see the sketch after this list).
- Orchestrate and automate workflows across services and systems.
- Collaborate with business and technical teams to deliver robust and scalable data solutions.
- Work in a support role to resolve incidents, handle change/service requests, and monitor performance.
- Contribute to CI/CD pipeline implementation using Azure DevOps.
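A minimal sketch of the PySpark cleanse/transform step mentioned above, as it might look in a Databricks notebook; the storage paths, container names, and columns are illustrative assumptions.

```python
# Hedged sketch: read a bronze dataset, clean it, and write a silver Delta table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

raw = spark.read.parquet("abfss://bronze@account.dfs.core.windows.net/orders/")

clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_date", "yyyy-MM-dd"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .filter(F.col("amount").isNotNull())
)

clean.write.format("delta").mode("overwrite").save(
    "abfss://silver@account.dfs.core.windows.net/orders/"
)
```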
Technical Requirements:
- 4 to 6 years of experience in IT and Azure data engineering technologies.
- Strong experience in Azure Databricks, Azure Synapse, and ADLS Gen2.
- Proficient in Python, PySpark, and SQL.
- Experience with file formats such as JSON and Parquet.
- Working knowledge of database systems, with a preference for Teradata and Snowflake.
- Hands-on experience with Azure DevOps and CI/CD pipeline deployments.
- Understanding of Data Warehousing concepts and data modeling best practices.
- Familiarity with SNOW (ServiceNow) for incident and change management.
Non-Technical Requirements:
- Ability to work independently and collaboratively in virtual teams across geographies.
- Strong analytical and problem-solving skills.
- Experience in Agile development practices, including estimation, testing, and deployment.
- Effective task and time management with the ability to prioritize under pressure.
- Clear communication and documentation skills for project updates and technical processes.
Technologies:
- Azure Data Factory
- Azure Databricks
- Azure Synapse Analytics
- PySpark / SQL
- Azure Data Lake Storage (ADLS), Blob Storage
- Azure DevOps (CI/CD pipelines)
Nice-to-Have:
- Experience with Business Intelligence tools, preferably Power BI
- DP-203 certification (Azure Data Engineer Associate)
NOTE -
Weekly rotational shifts -
11am to 8pm
2pm to 11pm
5pm to 2 am
P.S. - On any one weekend they should be available on call; if any issues arise, they should work on them on their own. On-call support will be required once a month.
Relevant experience in React JS: 2+ years
Looking for candidates from Mumbai who can go to the office in Mumbai (Malad) after Covid ends.
- Knowledge of OO JavaScript
- 10+ years hands-on experience with web development using HTML5, JavaScript (React, Preact.js, Redux, Vue.js), CSS3, SignalR, Socket.io
- Experience building responsive design layouts using a formal framework like Bootstrap, and familiarity with best practices (web security concepts, ensuring browser & device compatibility, etc.)
- Experience in creating secure pre-compiled script libraries
- Familiar with development and debugging tools for cross-browser issues
- Experience integrating with RESTful APIs for server-side
Job Profile :- Female Telecaller Sales Executive
Job Location :- Bhopal, Madhya Pradesh
Qualification :- Any Graduate
Salary :- 8000 + incentives
Industry type :- IT Services & Consulting
Roles & Responsibilities :-
· Ability to make calls each day using the provided leads.
· Ability to convince customers and generate meetings through calling.
· Follow up with incoming leads and generate prospects over the phone.
· The goal is to help the company grow by bringing in customers and developing business.
· Good business communication skills with proficiency in English.
· Taking inbound and outbound sales calls.
Regards
Sakshi
What Arcana (previously Newfang) Does:
Arcana is The Storage Layer for Ethereum, with a Privacy Stack for developers to build secure and privacy-preserving apps. Users care about privacy now more than ever, and it is becoming a first-class citizen in the modern development stack. Arcana makes it easier for apps to be privacy-focused out of the box.
With Arcana’s privacy stack, developers give control of data back to their users. They can easily integrate our SDK with their apps and use our modular feature blocks for Authentication, Identity Management, Encryption, Access Control, and Decentralized Storage. All of which are secured by and verifiable on the blockchain.
Visit www.arcana.network to learn more about us, our backers, and our offerings.
Role:
As the first QA hire at Arcana, you will be responsible for setting up testing frameworks and leading QA for all our products and releases.
Responsibilities:
- Design and develop a QA framework. Automate parts of the framework wherever feasible.
- Develop and implement test cases, scripts, plans, and procedures (manual and automated)
- Track key QA metrics for product and maintain a process for logging bugs/issues and debugging them.
- Handle QA validations of UI, backend APIs, databases, SDK, and Performance.
Skill Set Required:
- A degree in Computer Science or equivalent
- 2+ years of experience in a testing role at a platform/infrastructure software company
- Demonstrable experience using UI automation frameworks like Selenium, Appium, etc., and automated API testing using tools like Postman or similar (see the sketch after this list).
- Experience in Web testing, API testing, Mobile testing, and performance, security and usability testing
- Familiarity with test cycles (Unit, Regression, Functional, Systems, Stress & Scale, Smoke & Sanity)
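A minimal sketch of the kind of UI automation check mentioned above, using Selenium's Python bindings; the URL and element locator are hypothetical placeholders, not Arcana's actual app.

```python
# Hedged sketch: open a page, wait for an element, and assert on the title.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.support.ui import WebDriverWait

driver = webdriver.Chrome()
try:
    driver.get("https://example.com/login")                      # hypothetical page
    WebDriverWait(driver, 10).until(
        EC.visibility_of_element_located((By.ID, "login-form"))  # hypothetical locator
    )
    assert "Login" in driver.title
finally:
    driver.quit()
```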
Bonus:
- Familiarity with blockchain technology and distributed system design
- Experience with Solidity smart contract testing
- Prior software engineering experience
- Handled QA for open source projects
- As a DevOps Engineer, you need to have strong experience in CI/CD pipelines.
- Setup development, testing, automation tools, and IT infrastructure
- Defining and setting development, test, release, update, and support processes for DevOps operation
- Selecting and deploying appropriate CI/CD tools
- Deploy and maintain CI/CD pipelines across multiple environments (Mobile, Web APIs & AI/ML)
Required skills & experience:
- 3+ years of experience as DevOps Engineer and strong working knowledge in CI/CD pipelines
- Experience administering and deploying development CI/CD using Git, BitBucket, CodeCommit, Jira, Jenkins, Maven, Gradle, etc
- Strong knowledge in Linux-based infrastructures and AWS/Azure/GCP environment
- Working knowledge of AWS (IAM, EC2, VPC, ELB, ALB, Auto Scaling, Lambda, etc.)
- Experience with Docker containerization and clustering (Kubernetes/ECS)
- Experience with Android source (AOSP) cloning, building, and automation ecosystems
- Knowledge of scripting languages such as Python, Shell, Groovy, Bash, etc
- Familiar with Android ROM development and build process
- Knowledge of Agile Software Development methodologies
Job Description:
- The position will gather user requirements, create and maintain procedures, and assist in the configuration/customization of the system. This position requires flexibility with changing priorities.
- Generates new code and corrects, converts, and/or modifies existing code to meet specifications.
- Prepares detailed specifications from which code will be written.
- Writes and updates technical documentation such as user manuals, product specifications, and training materials.
- Performs a variety of testing procedures on assigned products, analyzes test results, and corrects problems.
- Good knowledge of ASP.NET, C# Windows applications, MVC/Web API, and SQL. Experienced in integrating clinical diagnostic processes into LIS processes through the software development life cycle, testing, verification, and validation. Development of instrument interfaces from the LIS to clinical laboratory instruments, and data interfaces with external stakeholders. Knowledge and experience of the LIS life cycle, from implementing requirements and design through configuration and development.
- Strong knowledge in Java and Golang.
- REST API experience/writing gRPC services is a must.
- Experience with CI/CD, Docker, Kubernetes, AWS is necessary.
- Experience in building Android mobile applications is a big plus.
- Strong knowledge of database systems and experience working on scalable systems is preferred.
- Experience in writing readable and maintainable code with proper documentation.
- A burning desire to work hard, be a quick learner, have a hunger to succeed, and have fun at the same time.
- No politics. No rules. No micromanagement. Just get the work done with great code quality.
- One who can understand product development and foresee future product features.








