9+ Data processing Jobs in India
Computer Operator duties and responsibilities
- Identifying and correcting file and system errors.
- Performing data processing operations according to a business production schedule.
- Performing backup procedures to reduce the risk of data loss.
- Maintaining computer equipment and inventory and organizing repairs as needed.
at AxionConnect Infosolutions Pvt Ltd
Job Location: Hyderabad / Bangalore / Chennai / Pune / Nagpur
Notice period: Immediate - 15 days
1. Python Developer with Snowflake
Job Description :
- 5.5+ years of strong Python development experience with Snowflake.
- Strong hands-on experience with SQL and the ability to write complex queries.
- Strong understanding of how to connect to Snowflake using Python; should be able to handle any type of file.
- Development of data analysis and data processing engines using Python.
- Good experience in data transformation using Python.
- Experience in Snowflake data loading using Python.
- Experience in creating user-defined functions in Snowflake.
- SnowSQL implementation.
- Knowledge of query performance tuning will be an added advantage.
- Good understanding of data warehouse (DWH) concepts.
- Interpret/analyze business requirements & functional specifications.
- Good to have: dbt, Fivetran, and AWS knowledge.
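As an illustration of the Snowflake data-loading skill above, here is a minimal sketch that composes a `COPY INTO` statement in Python. The table and stage names are hypothetical; in a real job the statement would be executed through a Snowflake connection (e.g. the `snowflake-connector-python` package's `cursor.execute()`), which is omitted here so the SQL-building logic stands alone.

```python
# Hypothetical sketch: composing a Snowflake COPY INTO statement
# for loading staged CSV files. Only the SQL string is built here;
# executing it would require a live Snowflake connection.

def build_copy_into(table: str, stage: str, file_format: str = "CSV") -> str:
    """Compose a COPY INTO statement for loading files from a named stage."""
    return (
        f"COPY INTO {table} "
        f"FROM @{stage} "
        f"FILE_FORMAT = (TYPE = {file_format} SKIP_HEADER = 1)"
    )

sql = build_copy_into("sales.orders", "order_stage")
print(sql)
```

Keeping statement construction in a small pure function like this makes the load logic easy to unit-test without touching a warehouse.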
GCP Data Analyst profile must have the below skill sets:
- Knowledge of programming languages like SQL, Oracle, R, MATLAB, Java, and Python
- Data cleansing, data visualization, data wrangling
- Data modeling, data warehouse concepts
- Ability to adapt to big data platforms like Hadoop and Spark for stream & batch processing
- GCP (Cloud Dataproc, Cloud Dataflow, Cloud Datalab, Cloud Dataprep, BigQuery, Cloud Datastore, Cloud Datafusion, Auto ML etc)
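The data-cleansing and wrangling skill listed above might look like the following sketch in plain Python. The field names and normalization rules are illustrative assumptions, not part of the role's actual schema; this is the kind of tidy-up pass that typically precedes loading data into BigQuery.

```python
# Minimal data-cleansing sketch (hypothetical field names):
# trim whitespace, normalize case, and drop rows missing a key field.

def cleanse(rows):
    cleaned = []
    for row in rows:
        name = (row.get("name") or "").strip()
        if not name:  # drop rows with no usable key field
            continue
        cleaned.append({
            "name": name.title(),
            "city": (row.get("city") or "unknown").strip().lower(),
        })
    return cleaned

raw = [
    {"name": "  alice ", "city": " Pune"},
    {"name": "", "city": "Chennai"},      # dropped: empty name
    {"name": "BOB", "city": None},        # missing city -> "unknown"
]
print(cleanse(raw))
```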
at MulticoreWare
Senior Engineer – Artificial Intelligence / Computer Vision
(Business Unit – Autonomous Vehicles & Automotive - AVA)
We are seeking an exceptional, experienced senior engineer with deep expertise in Computer Vision, Neural Networks, 3D Scene Understanding and Sensor Data Processing. The expectation is to lead a growing team of engineers to help them build and deliver customized solutions for our clients. A solid engineering as well as team management background is a must.
About MulticoreWare Inc
MulticoreWare Inc is a software and solutions development company with top-notch talent and skill in a variety of micro-architectures, including multi-thread, multi-core, and heterogeneous hardware platforms. It works in sectors including High Performance Computing (HPC), Media & AI Analytics, Video Solutions, Autonomous Vehicle and Automotive software, all of which are rapidly expanding. The Autonomous Vehicles & Automotive business unit specializes in delivering optimized solutions for sophisticated sensor fusion intelligence and the design of algorithms & implementation of software to be deployed on a variety of automotive grade hardware platforms.
Role Responsibilities
● Lead a team to solve problems in the perception / autonomous-systems space and turn ideas into code & products
● Drive all technical elements of development, such as project requirements definition, design, implementation, unit testing, integration, and software delivery
● Implement cutting-edge AI solutions on embedded platforms and optimize them for performance; hardware-architecture-aware algorithm design and development
● Contribute to the vision and long-term strategy of the business unit
Required Qualifications (Must Have)
● 3 - 7 years of experience with real world system building, including design, coding (C++/Python) and evaluation/testing (C++/Python)
● Solid experience in 2D / 3D Computer Vision algorithms, Machine Learning and Deep Learning fundamentals – Theory & Practice. Hands-on experience with Deep Learning frameworks like Caffe, TensorFlow or PyTorch
● Expert-level knowledge in any of the areas related to signal data processing or autonomous/robotics software development (perception, localization, prediction, planning), multi-object tracking, and sensor-fusion algorithms, with familiarity with Kalman filters, particle filters, clustering methods, etc.
● Good project management and execution capabilities, as well as good communication and coordination ability
● Bachelor’s degree in Computer Science, Computer Engineering, Electrical Engineering, or related fields
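To make the Kalman-filter familiarity mentioned above concrete, here is a toy one-dimensional Kalman filter for a constant-state model, the simplest member of the filtering family used in tracking and sensor fusion. All the noise values and measurements are made-up numbers for illustration only.

```python
# Illustrative 1-D Kalman filter (constant-state model).
# q: process-noise variance, r: measurement-noise variance,
# x0/p0: initial state estimate and its variance.

def kalman_1d(measurements, q=1e-4, r=0.1, x0=0.0, p0=1.0):
    """Return the sequence of filtered estimates for a scalar state."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q               # predict: state assumed constant, uncertainty grows
        k = p / (p + r)         # Kalman gain: trust in the new measurement
        x = x + k * (z - x)     # update the estimate with measurement z
        p = (1 - k) * p         # shrink the uncertainty after the update
        estimates.append(x)
    return estimates

est = kalman_1d([1.1, 0.9, 1.05, 0.95])
print(est)
```

The estimates settle near the true underlying value as measurements accumulate, which is the behavior interviewers typically probe for.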
Preferred Qualifications (Nice-to-Have)
● GPU architecture and CUDA programming experience, as well as knowledge of AI inference optimization using Quantization, Compression (or) Model Pruning
● Track record of research excellence with prior publications in top-tier conferences and journals
They provide both wholesale and retail funding.
- Key responsibility is to design & develop a data pipeline for real-time data integration, processing, execution of the model (if required), and exposing output via MQ / API / NoSQL DB for consumption
- Provide technical expertise to design efficient data ingestion solutions to store & process unstructured data, such as Documents, audio, images, weblogs, etc
- Developing API services to provide data as a service
- Prototyping Solutions for complex data processing problems using AWS cloud-native solutions
- Implementing automated Audit & Quality assurance Checks in Data Pipeline
- Document & maintain data lineage from various sources to enable data governance
- Coordination with BIU, IT, and other stakeholders to provide best-in-class data pipeline solutions, exposing data via APIs, loading in down streams, No-SQL Databases, etc
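The automated audit and quality-assurance checks mentioned above could be sketched as a small pipeline stage like the one below. The record shape and required fields are hypothetical; the point is that failing events are routed to a dead-letter list rather than silently dropped, so the audit trail survives.

```python
# Sketch of an automated audit check inside a pipeline stage
# (hypothetical event schema). Invalid events go to a dead-letter
# list with the reasons recorded.

def audit_events(events, required=("id", "ts")):
    passed, dead_letter = [], []
    for event in events:
        missing = [f for f in required if not event.get(f)]
        if missing:
            dead_letter.append({"event": event, "errors": missing})
        else:
            passed.append(event)
    return passed, dead_letter

ok, dlq = audit_events([
    {"id": 1, "ts": "2023-01-01T00:00:00Z"},
    {"id": None, "ts": "2023-01-01T00:00:05Z"},  # fails: missing id
])
print(len(ok), len(dlq))
```

In a real AWS deployment this logic would typically run inside a Lambda consuming a Kinesis stream, with the dead-letter list replaced by an SQS queue or S3 prefix.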
Skills
- Programming experience using Python & SQL
- Extensive working experience in Data Engineering projects, using AWS Kinesis, AWS S3, DynamoDB, EMR, Lambda, Athena, etc. for event processing
- Experience & expertise in implementing complex data pipelines
- Strong Familiarity with AWS Toolset for Storage & Processing. Able to recommend the right tools/solutions available to address specific data processing problems
- Hands-on experience in Unstructured (Audio, Image, Documents, Weblogs, etc) Data processing.
- Good analytical skills with the ability to synthesize data to design and deliver meaningful information
- Know-how on any No-SQL DB (DynamoDB, MongoDB, CosmosDB, etc) will be an advantage.
- Ability to understand business functionality, processes, and flows
- Good combination of technical and interpersonal skills with strong written and verbal communication; detail-oriented with the ability to work independently
Functional knowledge
- Real-time Event Processing
- Data Governance & Quality assurance
- Containerized deployment
- Linux
- Unstructured Data Processing
- AWS Toolsets for Storage & Processing
- Data Security
About the Organization
Real Estate Syndicators leverage SyndicationPro to manage billions in real estate assets and thousands of investors. Growing at 17% MoM, SyndicationPro is #1 Platform To Automate Real Estate Fund Raising, Investor Relations, & Close More Deals!
What makes SyndicationPro unique is that it is cash flow positive while maintaining a healthy growth rate and is backed by seasoned investors and real estate magnates. We are also a FirstPrinciples.io Venture Studio Company, giving us the product engineering, growth, and operational backbone in India to scale our team.
Our story has been covered by Bloomberg Business, Yahoo Finance, and several other renowned news outlets.
Why this Role
We are scaling our Customer Support & Success organization and you will have the opportunity to see how the systems and processes are built for a SaaS company that is rapidly scaling. Our Customer Support & Success Team is led by experienced SaaS operators who can provide you the guidance and mentorship to grow quickly. You will also have the opportunity for faster-than-normal promotion cycles.
Roles and Responsibilities
- Work on Migrating sensitive customer data from a third-party investment platform or Excel to SyndicationPro with minimal supervision.
- Understand the client’s needs and set up the SyndicationPro platform to meet customer expectations.
- Analyze customer expectations and data to share an expected completion time.
- Manage multiple migrations at the same time to ensure they are completed within the stipulated time.
- Work closely with internal and customer-facing teams to deep dive on a customer migration request and workflow using our systems to ensure nothing falls to the bottom of the to-do list
- Reviewing data for deficiencies or errors, correcting any incompatibilities, and checking the output.
- Keep current on product releases and updates
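The "reviewing data for deficiencies or errors" duty above might look like the following review pass in Python. The field names (`investor_id`, `amount`) are illustrative only, not the actual SyndicationPro schema; the sketch flags duplicate IDs and amounts that fail to parse as numbers.

```python
# Hedged sketch of a pre-migration data review: flag duplicate IDs
# and non-numeric amounts. Field names are hypothetical.

def review_rows(rows):
    seen, issues = set(), []
    for i, row in enumerate(rows):
        rid = row.get("investor_id")
        if rid in seen:
            issues.append((i, "duplicate investor_id"))
        seen.add(rid)
        try:
            float(row.get("amount", ""))
        except ValueError:
            issues.append((i, "non-numeric amount"))
    return issues

rows = [
    {"investor_id": "A1", "amount": "5000"},
    {"investor_id": "A1", "amount": "abc"},  # duplicate ID, bad amount
]
print(review_rows(rows))
```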
Desired candidate profile:
- Proven data entry or Migration work experience will be an asset
- Experience with MS Office and data programs
- Attention to detail
- Confidentiality
- Organization skills, with an ability to stay focused on assigned tasks
Expectations from the candidate:
- 0-2 years of experience. Prior experience in SAAS environments will be an added advantage
- Resolve processing migration problems working with the technical team
- Pay attention to detail to maintain Data accuracy
- Self-motivated and willing to excel under strict/short deadlines
- Ability to multitask as needed, with good time-management skills
- Must be comfortable working during night shifts.
An experienced and hands-on Technical Architect to lead our Video analytics & Surveillance product
• An ideal candidate would have worked on large-scale video platforms (YouTube, Netflix, Hotstar, etc.) or surveillance software
• As a Technical Architect, you are hands-on and also a top contributor to the product development
• Leading teams under time-sensitive projects
Skills Required:
• Expert-level Python programming skills are a MUST
• Hands-on experience with Deep Learning & Machine learning projects is a MUST
• Must have experience in the design and development of products
• Review code & mentor team in improving the quality and efficiency of the delivery
• Ability to troubleshoot and address complex technical problems.
• Must be a quick learner with the ability to adapt to increasing customer demands
• Hands-on experience in designing and deploying large-scale Docker and Kubernetes workloads
• Can lead a technically strong team in sharpening the product further
• Strong design capability with microservices-based architecture and its pitfalls
• Should have worked in large scale data processing systems
• Good understanding of DevOps processes
• Familiar with Identity management, Authorization & Authentication frameworks
• Possesses very strong software design, enterprise networking, and advanced problem-solving skills
• Experience writing technical architecture documents
MNC Pune based IT company
Candidate will be deployed in a financial captive organization @ Pune (Kharadi)
Below are the job details:
Experience: 10 to 18 years
Mandatory skills –
- Data migration
- Data flow
The ideal candidate for this role will have the below experience and qualifications:
- Experience of building a range of Services in a Cloud Service provider (ideally GCP)
- Hands-on design and development of Google Cloud Platform (GCP), across a wide range of GCP services including hands on experience of GCP storage & database technologies.
- Hands-on experience in architecting, designing or implementing solutions on GCP, K8s, and other Google technologies. Security and Compliance, e.g. IAM and cloud compliance/auditing/monitoring tools
- Desired Skills within the GCP stack - Cloud Run, GKE, Serverless, Cloud Functions, Vision API, DLP, Data Flow, Data Fusion
- Prior experience of migrating on-prem applications to cloud environments. Knowledge and hands-on experience of Stackdriver, Pub/Sub, VPCs, subnets, route tables, load balancers, and firewalls, both on-premise and in GCP.
- Integrate, configure, deploy and manage centrally provided common cloud services (e.g. IAM, networking, logging, Operating systems, Containers.)
- Manage SDN in GCP. Knowledge and experience of DevOps technologies around Continuous Integration & Delivery in GCP using Jenkins.
- Hands-on experience with Terraform, Kubernetes, Docker, and Stackdriver
- Programming experience in one or more of the following languages: Python, Ruby, Java, JavaScript, Go, Groovy, Scala
- Knowledge or experience in DevOps tooling such as Jenkins, Git, Ansible, Splunk, Jira or Confluence, AppD, Docker, Kubernetes
- Act as a consultant and subject matter expert for internal teams to resolve technical deployment obstacles and improve the product's vision. Ensure compliance with centrally defined security standards.
- Financial experience is preferred
- Ability to learn new technologies and rapidly prototype newer concepts
- Top-down thinker, excellent communicator, and great problem solver
Exp:- 10 to 18 years
Location:- Pune
Candidate must have experience in below.
- GCP Data Platform
- Data Processing: Dataflow, Dataprep, Data Fusion
- Data Storage: BigQuery, Cloud SQL
- Pub/Sub, GCS buckets
REQUIREMENT:
- Previous experience of working in large scale data engineering
- 4+ years of experience working in data engineering and/or backend technologies with cloud experience (any) is mandatory.
- Previous experience of architecting and designing backend for large scale data processing.
- Familiarity and experience of working in different technologies related to data engineering – different database technologies, Hadoop, Spark, Storm, Hive, etc.
- Hands-on and have the ability to contribute a key portion of data engineering backend.
- Self-inspired and motivated to drive for exceptional results.
- Familiarity and experience working with different stages of data engineering – data acquisition, data refining, large scale data processing, efficient data storage for business analysis.
- Familiarity and experience working with different DB technologies and how to scale them.
RESPONSIBILITY:
- End-to-end responsibility for coming up with the data engineering architecture and design, developing it, and then implementing it.
- Build data engineering workflow for large scale data processing.
- Discover opportunities in data acquisition.
- Bring industry best practices for data engineering workflow.
- Develop data set processes for data modelling, mining and production.
- Take additional tech responsibilities for driving an initiative to completion
- Recommend ways to improve data reliability, efficiency and quality
- Goes out of their way to reduce complexity.
- Humble and outgoing - engineering cheerleaders.
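The staged workflow described in the requirements (data acquisition, refining, large-scale processing, storage) can be sketched as a chain of small, swappable functions. Everything below is a toy stand-in: the acquisition source, refining rule, and sink are assumptions for illustration, not any particular production system.

```python
# Toy end-to-end data engineering workflow mirroring the stages
# listed above: acquisition -> refining -> storage. Each stage is
# a plain function so it can be tested and swapped independently.

def acquire():
    # stand-in for reading from a queue, API, or file drop
    return ["10", "x", "7"]

def refine(raw):
    # keep only records that parse cleanly as integers
    return [int(v) for v in raw if v.isdigit()]

def store(records, sink):
    # stand-in for writing to a warehouse or NoSQL store
    sink.extend(records)
    return len(records)

sink = []
stored = store(refine(acquire()), sink)
print(stored, sink)
```

Structuring the workflow this way is what lets a team "reduce complexity": each stage has one job, and scaling concerns (partitioning, retries) can be added around the stages rather than inside them.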