
11+ SFTP Jobs in Mumbai | SFTP Job openings in Mumbai

Apply to 11+ SFTP Jobs in Mumbai on CutShort.io. Explore the latest SFTP Job opportunities across top companies like Google, Amazon & Adobe.

Mumbai
5 - 8 yrs
₹25L - ₹30L / yr
SQL Azure
ADF
Azure Data Factory
Azure Data Lake
Azure Databricks
+13 more
As a hands-on Data Architect, you will be part of a team responsible for building enterprise-grade
Data Warehouse and Analytics solutions that aggregate data across diverse sources and data types
including text, video and audio through to live stream and IoT in an agile project delivery
environment with a focus on DataOps and Data Observability. You will work with Azure SQL
Databases, Synapse Analytics, Azure Data Factory, Azure Datalake Gen2, Azure Databricks, Azure
Machine Learning, Azure Service Bus, Azure Serverless (LogicApps, FunctionApps), Azure Data
Catalogue and Purview among other tools, gaining opportunities to learn some of the most
advanced and innovative techniques in the cloud data space.
You will be building Power BI based analytics solutions to provide actionable insights into customer
data, and to measure operational efficiencies and other key business performance metrics.
You will be involved in the development, build, deployment, and testing of customer solutions, with
responsibility for the design, implementation and documentation of the technical aspects, including
integration to ensure the solution meets customer requirements. You will be working closely with
fellow architects, engineers, analysts, team leads and project managers to plan, build and roll out
data-driven solutions.
Expertise:
Proven expertise in developing data solutions with Azure SQL Server and Azure SQL Data Warehouse (now
Synapse Analytics)
Demonstrated expertise in data modelling and data warehouse methodologies and best practices.
Ability to write efficient data pipelines for ETL using Azure Data Factory or equivalent tools.
Integration of data feeds utilising both structured (e.g. XML/JSON) and flat schemas (e.g. CSV, TXT, XLSX)
across a wide range of electronic delivery mechanisms (API, SFTP, etc.); a minimal SFTP ingestion sketch follows this list.
Azure DevOps knowledge essential for CI/CD of data ingestion pipelines and integrations.
Experience with object-oriented or functional scripting languages such as Python, Java, JavaScript, C#,
Scala, etc. is required.
Expertise in creating technical and architecture documentation (e.g. HLD/LLD) is a must.
Proven ability to rapidly analyse and design solution architecture in client proposals is an added advantage.
Expertise with big data tools: Hadoop, Spark, Kafka, NoSQL databases, stream-processing systems is a plus.
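A minimal sketch of the kind of feed ingestion described in this list, assuming an SFTP-hosted CSV drop with illustrative hostnames, credentials and paths (all placeholders); a production pipeline would more likely live in Azure Data Factory or Databricks than a standalone script:

```python
import paramiko
import pandas as pd

# Placeholder connection details -- in practice sourced from a secrets vault
SFTP_HOST = "sftp.example.com"
SFTP_PORT = 22
SFTP_USER = "feed_user"
SFTP_PASSWORD = "change-me"
REMOTE_DIR = "/outbound/daily"

def fetch_and_load(remote_file: str) -> pd.DataFrame:
    """Download one CSV feed file over SFTP and load it into a DataFrame."""
    transport = paramiko.Transport((SFTP_HOST, SFTP_PORT))
    transport.connect(username=SFTP_USER, password=SFTP_PASSWORD)
    sftp = paramiko.SFTPClient.from_transport(transport)
    try:
        local_path = f"/tmp/{remote_file}"
        sftp.get(f"{REMOTE_DIR}/{remote_file}", local_path)
    finally:
        sftp.close()
        transport.close()
    # Basic cleansing before handing off to the warehouse load step
    df = pd.read_csv(local_path)
    return df.drop_duplicates().dropna(how="all")

if __name__ == "__main__":
    frame = fetch_and_load("transactions_20240101.csv")
    print(frame.head())
```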
Essential Experience:
5 or more years of hands-on experience in a data architect role with the development of ingestion,
integration, data auditing, reporting, and testing with Azure SQL tech stack.
Full data and analytics project lifecycle experience (including costing and cost management of data
solutions) in an Azure PaaS environment is essential.
Microsoft Azure and Data Certifications, at least fundamentals, are a must.
Experience using agile development methodologies, version control systems and repositories is a must.
A good, applied understanding of the end-to-end data process development life cycle.
A good working knowledge of data warehouse methodology using Azure SQL.
A good working knowledge of the Azure platform, its components, and the ability to leverage its
resources to implement solutions is a must.
Experience working in the Public sector or in an organisation servicing the Public sector is a must.
Ability to work to demanding deadlines, keep momentum and deal with conflicting priorities in an
environment undergoing a programme of transformational change.
The ability to contribute and adhere to standards, have excellent attention to detail and be strongly driven
by quality.
Desirables:
Experience with AWS or Google Cloud platforms will be an added advantage.
Experience with Azure ML services will be an added advantage.
Personal Attributes:
Articulate and clear in communications to mixed audiences - in writing, through presentations and one-to-one.
Ability to present highly technical concepts and ideas in a business-friendly language.
Ability to effectively prioritise and execute tasks in a high-pressure environment.
Calm and adaptable in the face of ambiguity and in a fast-paced, quick-changing environment
Extensive experience working in a team-oriented, collaborative environment as well as working
independently.
Comfortable with the multi-project, multi-tasking lifestyle of a consulting Data Architect.
Excellent interpersonal skills for working with teams and building trust with clients.
Ability to support and work with cross-functional teams in a dynamic environment.
A passion for achieving business transformation; the ability to energise and excite those you work with
Initiative; the ability to work flexibly in a team, working comfortably without direct supervision.
Fatakpay
Posted by Ajit Kumar
Mumbai
2 - 4 yrs
₹8L - ₹12L / yr
SQL
Python
Problem solving
Data Warehouse (DWH)
Excel VBA

1. Bridging the gap between IT and the business using data analytics to assess processes, determine requirements and deliver data-driven recommendations and reports to executives and stakeholders.


2. Ability to search, extract, transform and load data from various databases, and cleanse and refine data until it is fit for purpose


3. Work within various time constraints to meet critical business needs, while measuring and identifying activities performed and ensuring service requirements are met


4. Prioritization of issues to meet deadlines while ensuring high-quality delivery


5. Ability to pull data and to perform ad hoc reporting and analysis as needed


6. Ability to adapt quickly to new and changing technical environments as well as strong analytical and problem-solving abilities


7. Strong interpersonal and presentation skills


SKILLS:


1. Advanced skills in designing reporting interfaces and interactive dashboards in Google Sheets and Excel


2. Experience working with senior decision-makers


3. Strong advanced SQL/MySQL and Python skills, with the ability to fetch data from the Data Warehouse as per the stakeholders' requirements (a minimal fetch sketch follows this list)


4. Good knowledge of and experience in Excel VBA and advanced Excel


5. Good experience in building Tableau analytical dashboards as per the stakeholders' reporting requirements


6. Strong communication/interpersonal skills
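A minimal sketch of the warehouse-fetch workflow mentioned in skill 3, assuming a MySQL-backed warehouse and an illustrative connection string and table name (both placeholders):

```python
import pandas as pd
from sqlalchemy import create_engine, text

# Placeholder DSN -- in practice this would come from a config/secret store
ENGINE = create_engine("mysql+pymysql://analyst:change-me@warehouse-host:3306/dwh")

def fetch_daily_kpis(start_date: str, end_date: str) -> pd.DataFrame:
    """Pull aggregated KPIs for a reporting period from the warehouse."""
    query = text("""
        SELECT txn_date, product, COUNT(*) AS orders, SUM(amount) AS revenue
        FROM fact_transactions
        WHERE txn_date BETWEEN :start AND :end
        GROUP BY txn_date, product
    """)
    return pd.read_sql(query, ENGINE, params={"start": start_date, "end": end_date})

if __name__ == "__main__":
    kpis = fetch_daily_kpis("2024-01-01", "2024-01-31")
    # Pivot into a dashboard-friendly shape: one row per day, one column per product
    print(kpis.pivot_table(index="txn_date", columns="product", values="revenue"))
```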


PERSONA:


1. Experience in working on ad hoc requirements


2. Ability to adapt to shifting priorities


3. Experience working in the Fintech or E-commerce industry is preferable


4. Engineering graduate with 2+ years of experience as a Business Analyst for finance processes

LogiNext
Posted by Rakhi Daga
Mumbai
4 - 7 yrs
₹12L - ₹19L / yr
Machine Learning (ML)
Data Science
PHP
Java
Spark
+1 more

LogiNext is looking for a technically savvy and passionate Senior Software Engineer - Data Science to analyze large amounts of raw information to find patterns that will help improve our company. We will rely on you to build data products to extract valuable business insights.

In this role, you should be highly analytical with a knack for analysis, math and statistics. Critical thinking and problem-solving skills are essential for interpreting data. We also want to see a passion for machine-learning and research.

Your goal will be to help our company analyze trends to make better decisions. A sound understanding of how the software works is essential for this role: apart from experience in developing with R and Python, you must know modern approaches to software development and their impact. DevOps practices such as continuous integration and deployment, along with experience in cloud computing, are everyday skills for managing and processing data.

Responsibilities :

  • Adapt and enhance machine learning techniques based on physical intuition about the domain
  • Design sampling methodology; prepare data, including data cleaning, univariate analysis and missing value imputation; identify appropriate analytic and statistical methodology; develop predictive models and document the process and results
  • Lead projects both as a principal investigator and as a project manager, responsible for meeting project requirements on schedule and on budget
  • Coordinate and lead efforts to innovate by deriving insights from heterogeneous sets of data generated by our suite of Aerospace products
  • Support and mentor data scientists
  • Maintain and work with our data pipeline that transfers and processes several terabytes of data using Spark, Scala, Python, Apache Kafka, Pig/Hive & Impala
  • Work directly with application teams/partners (internal clients such as Xbox, Skype, Office) to understand their offerings/domain and help them become successful with data so they can run controlled experiments (A/B testing)
  • Understand the data generated by experiments and produce actionable, trustworthy conclusions from them
  • Apply data analysis, data mining and data processing to present data clearly and develop experiments (A/B testing)
  • Work with the development team to build tools for data logging and repeatable data tasks to accelerate and automate data scientist duties


Requirements:

  • Bachelor’s or Master’s degree in Computer Science, Math, Physics, Engineering, Statistics or another technical field; PhD preferred
  • 4 to 7 years of experience in data mining, data modeling, and reporting
  • 3+ years of experience working with large data sets or doing large-scale quantitative analysis
  • Expert SQL scripting required
  • Development experience in one of the following: Scala, Java, Python, Perl, PHP, C++ or C#
  • Experience working with Hadoop, Pig/Hive, Spark, MapReduce
  • Ability to drive projects
  • Basic understanding of statistics: hypothesis testing, p-values, confidence intervals, regression, classification and optimization are core lingo (a minimal hypothesis-test sketch follows this list)
  • Analysis: should be able to perform Exploratory Data Analysis and derive actionable insights from the data, with impressive visualization
  • Modeling: should be familiar with ML concepts and algorithms; understanding of the internals and pros/cons of models is required
  • Strong algorithmic problem-solving skills
  • Experience manipulating large data sets through statistical software (e.g. R, SAS) or other methods
  • Superior verbal, visual and written communication skills to educate and work with cross-functional teams on controlled experiments
  • Experimentation design or A/B testing experience is preferred
  • Experience in team management
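A minimal sketch of the kind of controlled-experiment analysis referenced above, assuming a simple two-variant conversion test with illustrative counts (all numbers are placeholders):

```python
from statsmodels.stats.proportion import proportions_ztest

# Placeholder experiment results: conversions and sample sizes per variant
conversions = [420, 480]   # control, treatment
samples = [10000, 10000]

# Two-sided z-test for a difference in conversion rates
z_stat, p_value = proportions_ztest(count=conversions, nobs=samples)

control_rate = conversions[0] / samples[0]
treatment_rate = conversions[1] / samples[1]
print(f"control={control_rate:.2%} treatment={treatment_rate:.2%} "
      f"z={z_stat:.2f} p={p_value:.4f}")

# A small p-value (e.g. < 0.05) suggests the observed lift is unlikely under
# the null hypothesis of equal conversion rates
if p_value < 0.05:
    print("Reject H0: variants differ at the 5% significance level")
else:
    print("Fail to reject H0: no significant difference detected")
```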

Jio Platforms Limited
Posted by Dixit Nahar
Navi Mumbai, Mumbai
3 - 5 yrs
₹9L - ₹15L / yr
Python
TensorFlow
Keras
Apache Kafka
Spark
+7 more
Role: Data Scientist / Machine Learning Scientist / Deep Learning Engineer (3-5 years of experience)
Programming (must know): Python, TensorFlow, Keras, Kafka, Spark
  • Must have worked in Video Analytics with at least two deep learning model families such as R-CNN and LSTM, object detection models like YOLO, and object tracking models like Deep SORT (a minimal sequence-model sketch follows this list)
  • Must have good model training and testing experience with structured (statistical machine learning) and unstructured data
  • Must be good with statistics
  • Good to have: data visualization experience in Python or any data visualization tool
  • Good to have: Kubernetes and multiprocessing experience, and MLOps tooling such as Docker, Hydra, etc.
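A minimal sketch of LSTM-based sequence classification of the kind used in video analytics, assuming pre-extracted per-frame feature vectors and illustrative shapes (all placeholders; a real pipeline would plug in a detector/tracker such as YOLO and Deep SORT upstream):

```python
import numpy as np
import tensorflow as tf

# Placeholder data: 200 clips, 30 frames each, 128-dim features per frame,
# with a binary label per clip (e.g. "event of interest" vs. "background")
X = np.random.rand(200, 30, 128).astype("float32")
y = np.random.randint(0, 2, size=(200,))

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(30, 128)),
    tf.keras.layers.LSTM(64),                        # summarise the frame sequence
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # clip-level probability
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=3, batch_size=16, validation_split=0.2)

# Score a new clip (placeholder features)
new_clip = np.random.rand(1, 30, 128).astype("float32")
print("event probability:", float(model.predict(new_clip)[0, 0]))
```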

Team: We are a team of 9 data scientists working on Video Analytics and Data Analytics projects for the internal AI requirements of Reliance Industries as well as for external business. At any given time, we make progress on multiple projects (at least 4) in Video Analytics or Data Analytics.
Virtusa
Agency job via Response Informatics, posted by Anupama Lavanya Uppala
Chennai, Bengaluru (Bangalore), Mumbai, Hyderabad, Pune
3 - 10 yrs
₹10L - ₹25L / yr
PySpark
Python
  • Minimum 1 year of relevant experience in PySpark (mandatory)
  • Hands-on experience in developing, testing, deploying, maintaining and improving data integration pipelines in an AWS cloud environment is an added plus
  • Ability to play a lead role and independently manage a 3-5 member PySpark development team
  • EMR, Python and PySpark are mandatory
  • Knowledge of and familiarity with AWS Cloud technologies such as Apache Spark, Glue, Kafka, Kinesis and Lambda alongside S3, Redshift and RDS (a minimal PySpark sketch follows this list)
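A minimal PySpark sketch of the kind of integration pipeline described above, assuming CSV landing data in S3 and illustrative bucket names and columns (all placeholders); on EMR or Glue the SparkSession and S3 access would already be configured:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-daily-aggregate").getOrCreate()

# Placeholder S3 locations -- replace with the real landing and curated buckets
SOURCE_PATH = "s3://example-landing-bucket/orders/2024-01-01/"
TARGET_PATH = "s3://example-curated-bucket/orders_daily/"

orders = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv(SOURCE_PATH)
)

# Basic cleansing and a daily aggregate per product
daily = (
    orders
    .dropDuplicates(["order_id"])
    .filter(F.col("amount") > 0)
    .groupBy("order_date", "product_id")
    .agg(F.count("*").alias("orders"), F.sum("amount").alias("revenue"))
)

# Write partitioned Parquet for downstream consumers (Athena, Redshift Spectrum, etc.)
daily.write.mode("overwrite").partitionBy("order_date").parquet(TARGET_PATH)
```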
Billeasy
Posted by Kayum Ansari
Mumbai
5 - 7 yrs
₹5L - ₹9L / yr
Python
Machine Learning (ML)
Data Science
Natural Language Processing (NLP)
recommendation algorithm

Job Details:-

Designation - Data Scientist

Urgently required (notice period of maximum 15 days).

Location:- Mumbai

Experience:- 5-7 years.

Package Offered:- Rs.5,00,000/- to Rs.9,00,000/- pa.

 

Data Scientist

 

Job Description:-

Responsibilities:

  • Identify valuable data sources and automate collection processes
  • Undertake preprocessing of structured and unstructured data
  • Analyze large amounts of information to discover trends and patterns
  • Build predictive models and machine-learning algorithms
  • Combine models through ensemble modeling (a minimal ensemble sketch follows this list)
  • Present information using data visualization techniques
  • Propose solutions and strategies to business challenges
  • Collaborate with engineering and product development teams
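A minimal sketch of the ensemble-modeling step mentioned above, assuming a simple tabular classification task with synthetic placeholder data:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic placeholder data standing in for prepared customer features
X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

# Combine two different model families with soft voting
ensemble = VotingClassifier(
    estimators=[
        ("logreg", LogisticRegression(max_iter=1000)),
        ("forest", RandomForestClassifier(n_estimators=200, random_state=42)),
    ],
    voting="soft",
)
ensemble.fit(X_train, y_train)
print("ensemble accuracy:", accuracy_score(y_test, ensemble.predict(X_test)))
```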

 

Requirements:

  • Proven experience as a Data Scientist or Data Analyst
  • Experience in data mining
  • Understanding of machine-learning and operations research
  • Knowledge of R, SQL and Python; familiarity with Scala or Java is an asset
  • Experience using business intelligence tools (e.g. Tableau) and data frameworks (e.g. Hadoop)
  • Analytical mind and business acumen
  • Strong math skills (e.g. statistics, algebra)
  • Problem-solving aptitude
  • Excellent communication and presentation skills
  • BSc/BA in Computer Science, Engineering or relevant field; graduate degree in Data Science or other quantitative field is preferred
Mumbai
5 - 7 yrs
₹20L - ₹25L / yr
AWS Kinesis
Data engineering
AWS Lambda
DynamoDB
data pipeline
+11 more
  • Key responsibility is to design and develop a data pipeline for real-time data integration and processing, executing models where required, and exposing output via MQ / API / NoSQL DB for consumption
  • Provide technical expertise to design efficient data ingestion solutions to store and process unstructured data such as documents, audio, images, weblogs, etc.
  • Developing API services to provide data as a service
  • Prototyping Solutions for complex data processing problems using AWS cloud-native solutions
  • Implementing automated Audit & Quality assurance Checks in Data Pipeline
  • Document & maintain data lineage from various sources to enable data governance
  • Coordination with BIU, IT, and other stakeholders to provide best-in-class data pipeline solutions, exposing data via APIs, loading into downstream systems, NoSQL databases, etc.

Skills

  • Programming experience using Python & SQL
  • Extensive working experience in Data Engineering projects, using AWS Kinesis, AWS S3, DynamoDB, EMR, Lambda, Athena, etc. for event processing (a minimal event-processing sketch follows this list)
  • Experience & expertise in implementing complex data pipeline
  • Strong Familiarity with AWS Toolset for Storage & Processing. Able to recommend the right tools/solutions available to address specific data processing problems
  • Hands-on experience in Unstructured (Audio, Image, Documents, Weblogs, etc) Data processing.
  • Good analytical skills with the ability to synthesize data to design and deliver meaningful information
  • Know-how of any NoSQL DB (DynamoDB, MongoDB, CosmosDB, etc.) will be an advantage.
  • Ability to understand business functionality, processes, and flows
  • Good combination of technical and interpersonal skills with strong written and verbal communication; detail-oriented with the ability to work independently
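A minimal sketch of the real-time event-processing pattern described above, assuming a Lambda function triggered by a Kinesis stream and writing to an illustrative DynamoDB table (names are placeholders):

```python
import base64
import json

import boto3

# Placeholder table name -- in practice supplied via environment configuration
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("customer_events")

def handler(event, context):
    """Lambda entry point for a Kinesis trigger: decode, validate and persist events."""
    written = 0
    for record in event.get("Records", []):
        payload = base64.b64decode(record["kinesis"]["data"])
        item = json.loads(payload)

        # Simple quality gate before persisting (part of the audit/quality checks)
        if "event_id" not in item or "customer_id" not in item:
            continue

        table.put_item(Item={
            "event_id": item["event_id"],
            "customer_id": item["customer_id"],
            "event_type": item.get("event_type", "unknown"),
            "payload": json.dumps(item),
        })
        written += 1
    return {"written": written}
```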

Functional knowledge

  • Real-time Event Processing
  • Data Governance & Quality assurance
  • Containerized deployment
  • Linux
  • Unstructured Data Processing
  • AWS Toolsets for Storage & Processing
  • Data Security

 

upGrad
Posted by Priyanka Muralidharan
Bengaluru (Bangalore), Mumbai
2 - 5 yrs
₹14L - ₹20L / yr
Product analyst
Data Analytics
Python
SQL
Tableau
 

About Us

upGrad is an online education platform building the careers of tomorrow by offering the most industry-relevant programs in an immersive learning experience. Our mission is to create a new digital-first learning experience to deliver tangible career impact to individuals at scale. upGrad currently offers programs in Data Science, Machine Learning, Product Management, Digital Marketing, and Entrepreneurship, etc. upGrad is looking for people passionate about management and education to help design learning programs for working professionals to stay sharp and stay relevant and help build the careers of tomorrow.

  • upGrad was awarded the Best Tech for Education by IAMAI for 2018-19

  • upGrad was also ranked as one of the LinkedIn Top Startups 2018: the 25 most sought-after startups in India

  • upGrad was earlier selected as one of the top ten most innovative companies in India by FastCompany.

  • We were also covered by the Financial Times along with other disruptors in Ed-Tech

  • upGrad is the official education partner for the Government of India - Startup India program

  • Our program with IIIT B has been ranked the #1 program in the country in the domain of Artificial Intelligence and Machine Learning

Role Summary

We are looking for an analytically inclined, insights-driven Data Analyst to make our organisation more data-driven. In this role you will be responsible for creating dashboards to drive insights for product and business teams, supporting day-to-day decisions as well as long-term impact assessment and measurement of the efficacy of different products and teams. The growing nature of the team will require you to be in touch with all of the teams at upGrad. If you are the go-to person everyone looks to for data, then this role is for you.

    Roles & Responsibilities

    • Lead and own the analysis of highly complex data sources, identifying trends and patterns in data and provide insights/recommendations based on analysis results

    • Build, maintain, own and communicate detailed reports to assist Marketing, Growth/Learning Experience and Other Business/Executive Teams

    • Own the design, development, and maintenance of ongoing metrics, reports, analyses, dashboards, etc. to drive key business decisions.

    • Analyze data and generate insights in the form of user analysis, user segmentation, performance reports, etc. (a minimal segmentation sketch follows this list)

    • Facilitate review sessions with management, business users and other team members

    • Design and create visualizations to present actionable insights related to data sets and business questions at hand

    • Develop intelligent models around channel performance, user profiling, and personalization
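A minimal sketch of the kind of user-segmentation analysis mentioned above, assuming an illustrative activity extract with placeholder columns:

```python
import pandas as pd

# Placeholder extract standing in for a learner-activity export from the warehouse
events = pd.DataFrame({
    "user_id":  [1, 1, 2, 2, 2, 3, 4, 4],
    "sessions": [3, 5, 1, 2, 4, 12, 0, 1],
    "week":     ["W1", "W2", "W1", "W2", "W3", "W1", "W1", "W2"],
})

# Aggregate activity per user and bucket into simple engagement segments
per_user = events.groupby("user_id", as_index=False)["sessions"].sum()
per_user["segment"] = pd.cut(
    per_user["sessions"],
    bins=[-1, 0, 5, 10, float("inf")],
    labels=["inactive", "low", "medium", "high"],
)

# Segment sizes feed directly into a dashboard or weekly report
print(per_user.groupby("segment", observed=True)["user_id"].count())
```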

      Skills Required

      • 3-5 years of hands-on experience with product-related analytics and reporting

      • Experience with building dashboards in Tableau or other data visualization tools such as D3

      • Strong data, statistics, and analytical skills with a good grasp of SQL.

      • Programming experience in Python is a must

      • Comfortable managing large data sets

      • Good Excel/data management skills

PAGO Analytics India Pvt Ltd
Posted by Vijay Cheripally
Remote, Bengaluru (Bangalore), Mumbai, NCR (Delhi | Gurgaon | Noida)
2 - 8 yrs
₹8L - ₹15L / yr
Python
PySpark
Microsoft Windows Azure
SQL Azure
Data Analytics
+6 more
Be an integral part of large scale client business development and delivery engagements
Develop the software and systems needed for end-to-end execution on large projects
Work across all phases of SDLC, and use Software Engineering principles to build scaled solutions
Build the knowledge base required to deliver increasingly complex technology projects


Object-oriented languages (e.g. Python, PySpark, Java, C#, C++ ) and frameworks (e.g. J2EE or .NET)
Database programming using any flavours of SQL
Expertise in relational and dimensional modelling, including big data technologies
Exposure across all the SDLC process, including testing and deployment
Expertise in Microsoft Azure is mandatory, including components like Azure Data Factory, Azure Data Lake Storage, Azure SQL, Azure Databricks, HDInsight, ML Service, etc.
Good knowledge of Python and Spark is required
Good understanding of how to enable analytics using cloud technology and ML Ops
Experience in Azure Infrastructure and Azure DevOps will be a strong plus
CRMnext
Posted by Deepak Sharma
Mumbai
2 - 4.6 yrs
₹3L - ₹6L / yr
Python
Big Data
Data Science
R Programming

We are Still Hiring!!!


Dear Candidate,


This email is regarding open positions for Data Engineer Professionals with our organisation CRMNext.


In case you find the company profile and JD matching your aspirations, and your profile matches the required skills and qualifications criteria, please share your updated resume along with responses to the questions.


We shall reach out to you to schedule interviews after this.


About Company:


Driven by a Passion for Excellence


Acidaes Solutions Pvt. Ltd. is a fast growing specialist Customer Relationship Management (CRM) product IT company providing ultra-scalable CRM solutions. It offers CRMNEXT, our flagship and award winning CRM platform to leading enterprises both on cloud as well as on-premise models. We consistently focus on using the state of art technology solutions to provide leading product capabilities to our customers.


CRMNEXT is a global cloud CRM solution provider credited with the world's largest installation ever. From Fortune 500 to start-ups, businesses across nine verticals have built profitable customer relationships via CRMNEXT. A pioneer of Digital CRM for some of the largest enterprises across Asia-Pacific, CRMNEXT's customers include global brands like Pfizer, HDFC Bank, ICICI Bank, Axis Bank, Tata AIA, Reliance, National Bank of Oman, Pavers England etc. It was recently lauded in the Gartner Magic Quadrant 2015 for Lead management, Sales Force Automation and Customer Engagement. For more information, visit us at www.crmnext.com


Educational Qualification:


B.E./B.Tech /M.E./ M.Tech/ MCA with (Bsc.IT/Bsc. Comp/BCA is mandatory)


60% in Xth, XIIth /diploma, B.E./B.Tech/M.E/M.Tech/ MCA with (Bsc.IT/Bsc. Comp/BCA is mandatory)


All education should be regular (please note: degrees obtained through distance learning/correspondence will not be considered).


Exp level- 2 to 5 yrs


Location-Andheri (Mumbai)


Technical expertise required:


1) Analytics experience in the BFSI domain is a must


2) Hands-on technical experience in Python, big data and AI


3) Understanding of data models and analytical concepts


4) Client engagement:


Should have run client engagements in the past for Big Data / AI projects, from requirement gathering through planning development sprints to delivery


Should have experience in deploying big data and AI projects


First-hand experience of data governance, data quality, customer data models and industry data models


Aware of SDLC.

 


Regards,
Deepak Sharma
HR Team

AthenasOwl
Posted by Ericsson Fernandes
Mumbai
3 - 7 yrs
₹10L - ₹20L / yr
Deep Learning
Natural Language Processing (NLP)
Machine Learning (ML)
Computer vision
Python
+1 more

Company Profile and Job Description  

About us:  

AthenasOwl (AO) is our “AI for Media” solution that helps content creators and broadcasters to create and curate smarter content. We launched the product in 2017 as an AI-powered suite meant for the media and entertainment industry. Clients use AthenasOwl's context-adapted technology for redesigning content, making better targeting decisions, automating hours of post-production work and monetizing massive content libraries.

For more details visit: www.athenasowl.tv   

  

Role:   

Senior Machine Learning Engineer  

Experience Level:   

4 -6 Years of experience  

Work location:   

Mumbai (Malad W)   

  

Responsibilities:   

  • Develop cutting edge machine learning solutions at scale to solve computer vision problems in the domain of media, entertainment and sports
  • Collaborate with media houses and broadcasters across the globe to solve niche problems in the field of post-production, archiving and viewership
  • Manage a team of highly motivated engineers to deliver high-impact solutions quickly and at scale

 

 

The ideal candidate should have:   

  • Strong programming skills in any one or more programming languages like Python and C/C++
  • Sound fundamentals of data structures, algorithms and object-oriented programming
  • Hands-on experience with any one popular deep learning framework like TensorFlow, PyTorch, etc.
  • Experience in implementing Deep Learning Solutions (Computer Vision, NLP etc.)
  • Ability to quickly learn and communicate the latest findings in AI research
  • Creative thinking for leveraging machine learning to build end-to-end intelligent software systems
  • A pleasantly forceful personality and charismatic communication style
  • Someone who will raise the average effectiveness of the team and has demonstrated exceptional abilities in some area of their life. In short, we are looking for a “Difference Maker”

 
