

We have an urgent requirement for Big Data Developer profiles at our reputed MNC company.
Location: Pune/Bangalore/Hyderabad/Nagpur
Experience: 4-9 years
Skills: PySpark, AWS
or Spark, Scala, AWS
or Python, AWS
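
For candidates gauging fit, here is a minimal, hypothetical sketch of the PySpark-on-AWS combination listed above; the bucket and paths are invented placeholders, not part of the role description.

```python
# Minimal PySpark-on-AWS sketch: read CSVs from S3, aggregate, write Parquet back.
# Bucket and paths are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("s3-aggregation-demo").getOrCreate()

# s3a:// URIs assume the hadoop-aws package and AWS credentials are configured.
orders = spark.read.csv("s3a://example-bucket/orders/*.csv",
                        header=True, inferSchema=True)

daily_totals = (
    orders.groupBy("order_date")
          .agg(F.sum("amount").alias("total_amount"),
               F.count("*").alias("order_count"))
)

daily_totals.write.mode("overwrite").parquet("s3a://example-bucket/daily_totals/")
```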


· Design, develop, and implement AI/ML models and algorithms.
· Focus on building Proof of Concept (POC) applications to demonstrate the feasibility and value of AI solutions.
· Write clean, efficient, and well-documented code.
· Collaborate with data engineers to ensure data quality and availability for model training and evaluation.
· Work closely with senior team members to understand project requirements and contribute to technical solutions.
· Troubleshoot and debug AI/ML models and applications.
· Stay up-to-date with the latest advancements in AI/ML.
· Utilize machine learning frameworks (e.g., TensorFlow, PyTorch, Scikit-learn) to develop and deploy models.
· Develop and deploy AI solutions on Google Cloud Platform (GCP).
· Implement data preprocessing and feature engineering techniques using libraries like Pandas and NumPy (a short sketch follows this list).
· Utilize Vertex AI for model training, deployment, and management.
· Integrate and leverage Google Gemini for specific AI functionalities.
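
As a rough illustration of the preprocessing and modelling duties above, here is a small, hypothetical sketch using Pandas, NumPy, and scikit-learn; the columns and data are invented, and the same engineered features could equally feed a TensorFlow, PyTorch, or Vertex AI workflow.

```python
# Hypothetical feature-engineering sketch with Pandas/NumPy feeding a
# scikit-learn model. Column names and values are invented for illustration.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

df = pd.DataFrame({
    "income": [45000, 82000, 31000, 120000],
    "age": [23, 41, 35, 52],
    "defaulted": [1, 0, 1, 0],
})

# Log-transform the skewed income column and bucket age into coarse bands.
df["log_income"] = np.log1p(df["income"])
df["age_band"] = pd.cut(df["age"], bins=[0, 30, 45, 100], labels=False)

X = df[["log_income", "age_band"]]
y = df["defaulted"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
print(model.score(X_test, y_test))
```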
Qualifications:
· Bachelor’s degree in Computer Science, Artificial Intelligence, or a related field.
· 3+ years of experience in developing and implementing AI/ML models.
· Strong programming skills in Python.
· Experience with machine learning frameworks such as TensorFlow, PyTorch, or Scikit-learn.
· Good understanding of machine learning concepts and techniques.
· Ability to work independently and as part of a team.
· Strong problem-solving skills.
· Good communication skills.
· Experience with Google Cloud Platform (GCP) is preferred.
· Familiarity with Vertex AI is a plus.
- Planning concepts by studying relevant information and materials.
- Illustrating concepts by designing examples of art arrangement, size, type size, and style, and submitting them for approval.
- Preparing finished art by operating necessary equipment and software.
- Coordinating with outside agencies, art services, web designer, marketing, printers, and colleagues as necessary.
- Contributing to team efforts by accomplishing tasks as needed.
- Communicating with clients about layout and design.
- Creating a wide range of graphics and layouts for product illustrations, company logos, and websites with software such as Photoshop.
- Reviewing final layouts and suggesting improvements when necessary.



Your toolbox:
Experience: 4+ years. Strong C/C++/C#/.NET Core development skills with a good understanding of object-oriented and multi-threaded design.
Strong background in computer science fundamentals (data structures, algorithms).
Passionate about learning and exploring new technologies; demonstrates good analysis and problem-solving skills.
Good written and verbal communication skills; should be a quick learner and a team player.
B.E./B.Tech (CS/IT) or MCA/M.E./M.Tech (CS/IT)
Big plus (mastery of one or more of the below):
Network troubleshooting skills [TCP/IP, SSH, HTTPS] (a small reachability sketch follows this list)
Hands-on Kubernetes and cloud environment experience
Hands-on experience with UNIX or Linux operating systems
Strong with VoIP technologies [SIP and RTP]
Good understanding of SOA architecture
In-depth knowledge and hands-on experience with AWS services and other similar cloud services
Strong knowledge of core architectural concepts, including distributed computing, scalability, availability, and performance, to recommend the best backend solutions for our products
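
As a rough illustration of the TCP/IP troubleshooting skills mentioned above, here is a small sketch that probes SSH and HTTPS reachability; the host name is a placeholder.

```python
# Basic TCP reachability probe: the kind of first-pass check used when
# troubleshooting SSH (22) or HTTPS (443) connectivity. Host is a placeholder.
import socket

def tcp_check(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP handshake to host:port completes within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

for port in (22, 443):
    status = "open" if tcp_check("example.com", port) else "unreachable"
    print(f"example.com:{port} is {status}")
```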
Preferred AWS Certifications:
- AWS Solutions Architect (Professional/Associate)
- AWS DevOps Engineer (Professional)
- AWS SysOps Administrator (Associate)
- AWS Developer (Associate)
Key Skills
- ITCAN is looking for an AWS Solution Architect who will be responsible for the development of scalable, optimized, and reliable backend solutions using AWS services for all our products. You will ensure that our products consume AWS services in the most effective way. Therefore, a commitment to collaborative problem solving, sophisticated design, and a quality product is important.
Responsibilities:
- Analyse requirements and devise innovative, efficient, and cost-effective architecture using AWS components and services that ensure scalability, availability, and high performance.
- Develop automation and deployment utilities using Ruby, Bash, and shell scripting, and implement CI/CD pipelines using Jenkins, CodeDeploy, Git, CodePipeline, CodeCommit, etc., to ensure seamless deployment with no downtime.
- Redesign architectures end-to-end seamlessly by working through major software upgrades such as Apache.
- Ensure an always-running network with the ability to set up redundant DNS systems with failover capabilities.
- Ensure the AWS services consumed are aligned with best practices to ensure higher availability and security along with optimal cost utilization.
- Using AWS-managed services, implement ELK systems end-to-end.
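
As one hedged example of the ELK bullet above, the sketch below provisions the search tier on AWS-managed services (Amazon OpenSearch Service) with boto3; the domain name, instance sizing, and region are assumptions. Log shippers such as Logstash or Kinesis Data Firehose would then be pointed at the resulting domain endpoint.

```python
# Hypothetical sketch: provision the search tier of an ELK-style stack on
# AWS-managed services (Amazon OpenSearch Service) using boto3.
# Domain name, sizing, and region are placeholder assumptions.
import boto3

client = boto3.client("opensearch", region_name="us-east-1")

response = client.create_domain(
    DomainName="app-logs",
    ClusterConfig={
        "InstanceType": "t3.small.search",
        "InstanceCount": 2,
    },
    EBSOptions={
        "EBSEnabled": True,
        "VolumeType": "gp3",
        "VolumeSize": 20,
    },
)
print(response["DomainStatus"]["ARN"])
```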


Quantum Corp. leads the world in helping users collaboratively solve some of the world's hardest computer challenges. Whether it's wrangling petabytes of data for cutting-edge movie production, managing global content production workflows, exploring new sources of energy, or managing oceans of content and blending collaborative high-speed workflows with cutting-edge object storage, Quantum has the tools and technology to help you engineer the solution that works now - and preserves your work for decades to come. You are applying to an exceptional team that contributes to enhancing our position as a proven global expert in data management.
Job Summary and Duties:
This position will highlight new product development, as well as feature development and bug fixing of existing products serving the expanding big data, NAS, virtualization, replication, and file systems market. This position requires someone well rounded in operating systems, including all flavors of Linux, Windows, and macOS. Building specialized core (user space and kernel space) software is the main function of this role.
Specific duties include but are not limited to:
• Development and maintenance of new Primary Storage products.
• Investigating and resolving issues in existing products related to application workflows, cross-platform interactions, and new operating systems.
• Triage and disposition incoming issues from support cases, customer interactions, and new feature requests.
Job Requirements:
• Minimum 8 years of combined education and experience including 5 years of C/C++ programming.
• Expertise with SMB/NFS networking protocol stacks (Samba).
• Filesystem VFS layer expertise (Samba VFS modules).
• Experience with Python and Linux shell programming (a small sketch follows this list).
• Experience with SAN and Ethernet networking technologies.
• Ability to work in a team environment.
• Strong communication skills.
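
As a small illustration of the Python and shell requirement above, the sketch below lists SMB shares on a NAS host via smbclient; the host name is a placeholder, and the tool must be installed on the machine running the script.

```python
# Python-plus-shell sketch in the spirit of the requirements above:
# list SMB shares on a NAS host with smbclient. Host name is a placeholder.
import subprocess

def list_smb_shares(host: str) -> str:
    """Run `smbclient -L //host -N` (anonymous share listing) and return stdout."""
    result = subprocess.run(
        ["smbclient", "-L", f"//{host}", "-N"],
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout

print(list_smb_shares("nas01.example.com"))
```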
Desired Skills:
• Knowledge of kernel internals, including any of the following: Linux and macOS.
• Experience interfacing with special-purpose file system APIs and web services.
• Knowledge and experience with container technologies (Kubernetes, Docker)
• Knowledge and experience with Virtualization technologies: ESX, KVM
• Experience working in an Agile environment using CI methodologies.
- Experience as a SAP Consultant with at least one end-to-end SAP implementation project.
- Hands-on experience in the requirements gathering/fit-gap, design/blueprinting, and configuration/customization phases of SAP transformation programs.
- Deep understanding of business processes as well as good knowledge of technical issues in the area of financial modules.
- Very good command of English.
- Team management and project management skills are an asset.


Should be able to join in 30 days or less.



What you will be doing:
As a part of the Global Credit Risk and Data Analytics team, this person will be responsible for carrying out the following analytical initiatives:
- Dive into the data and identify patterns
- Development of end-to-end credit models and credit policy for our existing credit products
- Leverage alternate data to develop best-in-class underwriting models
- Working on Big Data to develop risk analytical solutions
- Development of Fraud models and fraud rule engine
- Collaborate with various stakeholders (e.g. tech, product) to understand and design best solutions which can be implemented
- Working on cutting-edge techniques e.g. machine learning and deep learning models
Example of projects done in past:
- Lazypay credit risk model using the CatBoost modelling technique; end-to-end pipeline for feature engineering and model deployment in production using Python (a brief sketch follows)
- Fraud model development, deployment and rules for EMEA region
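
As a rough sketch of the kind of CatBoost pipeline the first project describes (not the actual Lazypay model), here is a minimal, invented example; the features, data, and hyperparameters are illustrative only.

```python
# Minimal sketch of a CatBoost credit-risk classifier, in the spirit of the
# project above. Data, feature names, and hyperparameters are invented.
import pandas as pd
from catboost import CatBoostClassifier

train = pd.DataFrame({
    "monthly_income": [30000, 85000, 42000, 15000],
    "employment_type": ["salaried", "self-employed", "salaried", "student"],
    "defaulted": [0, 0, 1, 1],
})

X = train[["monthly_income", "employment_type"]]
y = train["defaulted"]

# CatBoost handles categorical features natively via cat_features.
model = CatBoostClassifier(iterations=200, depth=4,
                           learning_rate=0.1, verbose=False)
model.fit(X, y, cat_features=["employment_type"])

print(model.predict_proba(X)[:, 1])  # probability of default per applicant
```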
Basic Requirements:
- 1-3 years of work experience as a data scientist (in the credit domain)
- 2016 or 2017 batch from a premier college (e.g., B.Tech. from IITs, NITs, Economics from DSE/ISI, etc.)
- Strong problem-solving skills and the ability to understand and execute complex analyses
- Experience in at least one of R/Python/SAS, plus SQL
- Experience in the credit industry (fintech/bank)
- Familiarity with the best practices of data science
Add-on Skills :
- Experience in working with big data
- Solid coding practices
- Passion for building new tools/algorithms
- Experience in developing Machine Learning models
- Advise students/parents on their learning needs through structured counseling sessions
- Fix appointments and conduct home demo sessions on a daily basis, including follow-up sessions
- Understand the customer's profile and problems to explain the implications of ineffective learning methods
- Create the need for Smart Learning and advise the student/parent to buy a Toppr subscription as the solution
- Handle objections and price negotiation to generate sales revenue



