
- The Data & Analytics team is responsible for integrating new data sources and building data models, data dictionaries, and machine learning models for the Wholesale Bank.
- The goal is to design and build data products that support squads in the Wholesale Bank with business outcomes and the development of business insights. In this job family we distinguish between Data Analysts and Data Scientists. Both work with data and are expected to write queries, work with engineering teams to source the right data, perform data munging (getting data into the correct format, convenient for analysis and interpretation; see the sketch after this posting), and derive information from data.
- The Data Analyst typically works on structured SQL or similar databases, or with other BI tools/packages. The Data Scientist is expected to build statistical models and be hands-on in machine learning and advanced programming.
- The Data Scientist's role is to support our corporate banking teams with insights gained from analyzing company data. The ideal candidate is adept at using large data sets to find opportunities for product and process optimization, and at using models to test the effectiveness of different courses of action. They must have strong experience with a variety of data mining/data analysis methods and data tools, building and implementing models, using and creating algorithms, and creating and running simulations. They must have banking or corporate banking experience.
Experience: 6-10 years
Analytics
- Should be comfortable building Wholesale Banking domain analytical solutions on an AI/ML platform
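As a rough illustration of the data munging described above, here is a minimal Python/pandas sketch; the file name and column names are hypothetical and not taken from the posting.

```python
import pandas as pd

# Hypothetical wholesale-banking extract; file name and columns are illustrative.
df = pd.read_csv("client_transactions.csv", parse_dates=["txn_date"])

# Typical munging steps: normalize types and drop unusable rows.
df["amount"] = pd.to_numeric(df["amount"], errors="coerce")
df = df.dropna(subset=["amount", "client_id"])

# Reshape to a client-month level, a convenient grain for analysis or modeling.
monthly = (
    df.assign(month=df["txn_date"].dt.to_period("M"))
      .groupby(["client_id", "month"], as_index=False)["amount"]
      .sum()
)
print(monthly.head())
```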

Responsibilities
We're searching for a full-time Business Development Intern to join our closely-knit team, where we're passionate about the art of sales.
You'll be part of our outbound sales team. This role would require you to identify the right prospects for Zipy, generate leads, and set up demos.
If you thrive under pressure, possess a creative mind, and are ready to join a dynamic team that's redefining the canvas, we want to hear from you.
Note: This is a 6-month work-from-office internship in Pune. The company may offer you a full-time position upon completion of your internship subject to satisfactory performance.
Who are you?
You have -
- Excellent listening skills
- Captivating communication abilities
- Strong research and analytical capabilities
- A creative and open mindset
- The confidence to express your ideas while constructively challenging others
- A great sense of humour
Company Description
Zipy is a company that aims to fix what matters by helping companies identify and prioritize resolving customer issues based on real-time customer journeys. We are dedicated to helping our customers deliver bug-free, exceptional digital experiences.
- Must have good knowledge of the GST Act and expertise in Excel and Tally
- Should have experience in dealing with the GST department
- Should have experience in handling GST audits
- Preparation of data for filing of GSTR-1 & GSTR-3B
Role: Taxation Executive
Industry Type: Accounting / Auditing
Department: Finance & Accounting
Employment Type: Full Time, Permanent
Role Category: Accounting & Taxation
- Provision Dev/Test/Prod infrastructure using Infrastructure as Code (IaC)
- Good knowledge of Terraform
- In-depth knowledge of security and IAM / Role Based Access Controls in Azure, management of Azure Application/Network Security Groups, Azure Policy, and Azure Management Groups and Subscriptions.
- Experience with Azure compute, storage, and networking (GCP experience can also be considered)
- Experience in working with ADLS Gen2, Databricks and Synapse Workspace
- Experience supporting cloud development pipelines using Git, CI/CD tooling, Terraform and other Infrastructure as Code tooling as appropriate
- Configuration management (e.g., Jenkins, Ansible, Git)
- General automation using Azure CLI, Python, PowerShell, and Bash scripting (see the Python sketch after this list)
- Experience with Continuous Integration/Continuous Delivery models
- Knowledge of and experience in resolving configuration issues
- Understanding of software and infrastructure architecture
- Experience in PaaS, Terraform, and AKS
- Monitoring, alerting, and logging tools, and build/release processes
- Understanding of computing technologies across Windows and Linux
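As a hedged illustration of the Azure CLI/Python automation mentioned above, here is a minimal sketch; the resource group name and location are hypothetical, and it assumes `az login` has already been run.

```python
import json
import subprocess

def az(*args: str) -> object:
    """Run an az CLI command and return its parsed JSON output."""
    result = subprocess.run(
        ["az", *args, "--output", "json"],
        capture_output=True, text=True, check=True,
    )
    return json.loads(result.stdout)

# Provision a resource group (names are illustrative), then list groups
# back as a sanity check.
az("group", "create", "--name", "rg-dev-example", "--location", "eastus")
groups = az("group", "list")
print([g["name"] for g in groups])
```

In practice, declarative provisioning of this kind would usually live in Terraform, with scripts like this reserved for glue and verification steps.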
8-16 years of overall database administration and support/operations experience, with a successful delivery track record.
- Proficient knowledge of MySQL and HANA administration.
- Should have knowledge of setting up DR and HA approaches.
- Should have strong knowledge of backup and recovery (see the sketch after this list).
- Should have played an L2 role in their current position.
- Experience working in a large landscape/environment where multiple database flavors are running.
- Knowledge of database capacity planning/strategy is an added advantage.
- Knowledge of and experience with other database flavors, such as HANA, is an added advantage.
- Hands-on experience with, and expert knowledge of, Unix/Linux flavors with respect to databases.
- Good understanding of operational frameworks like ITIL/ITSM.
- Should be capable of owning, leading, and coordinating operational tasks, customer escalations, and process improvements.
- Excellent presentation and communication skills are a must.
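As a hedged illustration of the backup and recovery work mentioned above, here is a minimal Python sketch of a nightly logical backup with mysqldump; the host, user, and database names are hypothetical, and in practice credentials would come from a secured option file rather than the command line.

```python
import datetime
import subprocess

# Illustrative names only; a real setup would parameterize these.
stamp = datetime.date.today().isoformat()
dump_file = f"/backups/sales_{stamp}.sql"

with open(dump_file, "w") as out:
    subprocess.run(
        [
            "mysqldump",
            "--host=db01.example.internal",
            "--user=backup",
            "--single-transaction",   # consistent snapshot for InnoDB tables
            "--routines",             # include stored procedures/functions
            "--databases", "sales",
        ],
        stdout=out,
        check=True,
    )

# Recovery is the reverse: feed the dump back through the mysql client,
# e.g. mysql < /backups/sales_<date>.sql
```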
Founded by two MDI alumni, it is a student-centric and personalised learning platform that delivers enjoyable learning content aligned with the state boards. This ed-tech provides a solution that is easy to use, lets students enjoy learning, makes life easy for teachers, and delivers learning in the language students are most comfortable with. The organisation has worked in 14 states across India and was recognised by Google India under "Impacting Change through Digital".
- Writing blogs, write-ups, and various content for marketing and communications
- Ideating and writing content/scripts for success stories in the form of written, visual, and audio-visual outputs
- Posting on various channels and managing social media handles
- Designing using tools
Desired Candidate Profile
What you need to have:
- Focus and discipline
- Analytical, problem solving and calculative attitude
- Humility to learn, share & keep improving
- Excellent and proactive communication in all formats
- You will be part of a collaborative project team
- Will leverage several proprietary and standard tools to implement technical solutions.
- Responsible for understanding business requirements, providing estimations, developing solutions, writing unit test scenarios, and fixing defects for assigned applications.
- Familiarity with the ASP.NET framework, SQL Server, and application design
- Hands-on experience with C#/.NET, MVC, OOP concepts, jQuery, and Bootstrap
- Knowledge of .NET languages
- Familiarity with architecture styles/APIs (REST, RPC; illustrated briefly after this list)
- Understanding of Agile methodologies
- Strong attention to detail
- Excellent troubleshooting and communication skills
- Angular experience will be an advantage
- Able to work well in a team setting
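Although this role is C#/.NET-focused, here is a brief, hedged sketch of the REST conventions mentioned above, written in Python for compactness; the endpoint and payload are hypothetical.

```python
import requests

# REST models resources as URLs and maps operations onto HTTP verbs.
base = "https://api.example.com/v1"   # illustrative endpoint

# Read a resource with GET.
resp = requests.get(f"{base}/orders/42")
resp.raise_for_status()
order = resp.json()

# Create a resource with POST; the payload shape is made up for illustration.
created = requests.post(f"{base}/orders", json={"item": "widget", "qty": 3})
print(created.status_code, created.json())
```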
(Hadoop, HDFS, Kafka, Spark, Hive)
Overall Experience - 8 to 12 years
Relevant big data experience - 3+ years in the above
Salary: up to 20 LPA
Job location - Chennai / Bangalore
Notice period - immediate joiner / 15-20 days max
The Responsibilities of The Senior Data Engineer Are:
- Requirements gathering and assessment
- Break down complexity and translate requirements into specification artifacts and storyboards to build towards, using a test-driven approach
- Engineer scalable data pipelines using big data technologies including but not limited to Hadoop, HDFS, Kafka, HBase, Elastic
- Implement the pipelines using execution frameworks including but not limited to MapReduce, Spark, and Hive, using Java/Scala/Python for application design (a brief PySpark sketch follows this list).
- Mentoring juniors in a dynamic team setting
- Manage stakeholders with proactive communication upholding TheDataTeam's brand and values
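As a hedged illustration of such a pipeline, here is a minimal PySpark sketch; the HDFS paths, topic assumptions, and column names are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("event-aggregation").getOrCreate()

# Read raw events landed on HDFS (e.g., by an upstream Kafka ingestion job).
events = spark.read.parquet("hdfs:///data/raw/events")

# Transform: drop malformed records, then aggregate per user and day.
daily = (
    events.filter(F.col("event_type").isNotNull())
          .withColumn("day", F.to_date("event_ts"))
          .groupBy("user_id", "day")
          .agg(F.count("*").alias("event_count"))
)

# Write back in a columnar format (Parquet), partitioned by day for
# efficient downstream scans.
daily.write.mode("overwrite").partitionBy("day").parquet(
    "hdfs:///data/curated/daily_events"
)

spark.stop()
```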
A Candidate Must Have the Following Skills:
- Strong problem-solving ability
- Excellent software design and implementation ability
- Exposure and commitment to agile methodologies
- Detail oriented with willingness to proactively own software tasks as well as management tasks, and see them to completion with minimal guidance
- Minimum 8 years of experience
- Should have experience in full life-cycle of one big data application
- Strong understanding of various storage formats (ORC/Parquet/Avro)
- Should have hands-on experience with one of the Hadoop distributions (Hortonworks/Cloudera/MapR)
- Experience in at least one cloud environment (GCP/AWS/Azure)
- Should be well versed with at least one database (MySQL/Oracle/MongoDB/Postgres)
- Bachelor's in Computer Science, and preferably a Master's as well
- Should have good code review and debugging skills
Additional skills (Good to have):
- Experience in containerization (Docker/Heroku)
- Exposure to microservices
- Exposure to DevOps practices
- Experience in performance tuning of big data applications








