
11+ MSTR Jobs in India

Apply to 11+ MSTR Jobs on CutShort.io. Find your next job, effortlessly. Browse MSTR Jobs and apply today!

Latent Bridge Pvt Ltd

Posted by Mansoor Khan
Remote only
3 - 7 yrs
₹5L - ₹20L / yr
MicroStrategy administration
Amazon Web Services (AWS)
Business Intelligence (BI)
MSTR

Familiar with the MicroStrategy architecture; Admin Certification preferred

· Familiar with administrative functions: using Object Manager and Command Manager, installing/configuring MSTR in a clustered architecture, applying patches and hot-fixes

· Monitor and manage existing Business Intelligence development/production systems

· MicroStrategy installation, upgrade and administration on Windows and Linux platforms

· Ability to support and administer multi-tenant MicroStrategy infrastructure including server security troubleshooting and general system maintenance.

· Analyze application and system logs during troubleshooting and root cause analysis (a minimal log-scan sketch follows this list)

· Work on operations like deploying and managing packages, user management, schedule management, governing settings best practices, and database instance and security configuration.

· Monitor and report on report performance and investigate solutions to improve it.

· Continuously improve the platform through tuning, optimization, governance, automation, and troubleshooting.

· Provide support for the platform, report execution and implementation, user community and data investigations.

· Identify improvement areas in environment hosting and upgrade processes.

· Identify automation opportunities and participate in automation implementations

· Provide on-call support for Business Intelligence issues

· Experience of working on MSTR 2021, including knowledge of Enterprise Manager and newer features like Platform Analytics, HyperIntelligence, Collaboration, MSTR Library, etc.

· Familiar with AWS, Linux Scripting

· Knowledge of MSTR Mobile

· Knowledge of capacity planning and systems' scaling needs
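To make the log-analysis expectation above concrete, here is a minimal, hypothetical Python sketch of a routine log scan; the log path, keywords, and plain-text format are placeholder assumptions, not details from this posting:

```python
"""Minimal log-scan sketch for routine BI server monitoring.

The log path, keywords, and plain-text format are placeholders chosen
for illustration only.
"""
from collections import Counter
from pathlib import Path

LOG_FILE = Path("/opt/mstr/logs/DSSErrors.log")   # hypothetical path
KEYWORDS = ("ERROR", "FATAL", "Out of memory")    # hypothetical triggers


def scan_log(path: Path) -> Counter:
    """Count keyword occurrences, reading the log line by line."""
    hits = Counter()
    with path.open(errors="ignore") as fh:
        for line in fh:
            for kw in KEYWORDS:
                if kw in line:
                    hits[kw] += 1
    return hits


if __name__ == "__main__":
    for kw, n in scan_log(LOG_FILE).items():
        print(f"{kw}: {n} occurrence(s)")
```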

Mumbai
5 - 14 yrs
₹50L - ₹70L / yr
Machine Learning (ML)
Data Science
Natural Language Processing (NLP)
Computer Vision
Kubeflow
+8 more

Responsibilities:

  • Review data science models; handle code refactoring and optimization, containerization, deployment, versioning, and monitoring of model quality.
  • Design and implement cloud solutions, build MLOps on the cloud (preferably AWS).
  • Work with workflow orchestration tools like Kubeflow, Airflow, Argo, or similar tools (a minimal Airflow sketch follows this list).
  • Test and validate data science models, and automate testing.
  • Communicate with a team of data scientists, data engineers, and architects, and document the processes.
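As a rough illustration of the orchestration work above, a minimal Airflow DAG sketch; the DAG id, task names, and callables are hypothetical, and the operator import path assumes Airflow 2.x:

```python
"""Minimal two-step Airflow DAG sketch; names and schedule are illustrative."""
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator  # Airflow 2.x import path


def preprocess():
    print("preprocessing features...")   # placeholder step


def train():
    print("training model...")           # placeholder step


with DAG(
    dag_id="ml_pipeline_sketch",         # hypothetical DAG id
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    preprocess_task = PythonOperator(task_id="preprocess", python_callable=preprocess)
    train_task = PythonOperator(task_id="train", python_callable=train)
    preprocess_task >> train_task        # run preprocess before train
```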


Eligibility:

  • Rich hands-on experience in writing object-oriented code using Python
  • Min 3 years of MLOps experience (including model versioning, model and data lineage, monitoring, model hosting and deployment, scalability, orchestration, continuous learning, and automated pipelines)
  • Understanding of data structures, data systems, and software architecture
  • Experience in using MLOps frameworks like Kubeflow, MLflow, and Airflow Pipelines for building, deploying, and managing multi-step ML workflows based on Docker containers and Kubernetes (see the MLflow sketch after this list)
  • Exposure to deep learning approaches and modeling frameworks (PyTorch, TensorFlow, Keras, etc.)
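For the MLOps frameworks listed above, a minimal MLflow experiment-tracking sketch; the experiment name, parameters, and metric value are illustrative only:

```python
"""Minimal MLflow tracking sketch; all names and values are illustrative."""
import mlflow

mlflow.set_experiment("sketch-experiment")           # hypothetical experiment name

with mlflow.start_run():
    mlflow.log_param("model_type", "logistic_regression")
    mlflow.log_param("C", 1.0)
    mlflow.log_metric("val_accuracy", 0.87)          # placeholder metric
```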
Archwell
Agency job
via AVI Consulting LLP by Sravanthi Puppala
Mysore
2 - 8 yrs
₹1L - ₹15L / yr
Snowflake
Python
SQL
Amazon Web Services (AWS)
Windows Azure
+6 more

Title:  Data Engineer – Snowflake

 

Location: Mysore (Hybrid model)

Experience: 2-8 yrs

Type: Full Time

Walk-in date: 25th Jan 2023 @Mysore 

 

Job Role: We are looking for an experienced Snowflake developer to join our team as a Data Engineer who will work as part of a team to help design and develop data-driven solutions that deliver insights to the business. The ideal candidate is a data pipeline builder and data wrangler who enjoys building data-driven systems from the ground up to drive analytical solutions. You will be responsible for building and optimizing our data pipelines as well as building automated processes for production jobs. You will support our software developers, database architects, data analysts and data scientists on data initiatives.

 

Key Roles & Responsibilities:

  • Use advanced Snowflake, Python, and SQL to extract data from source systems for ingestion into a data pipeline (a minimal connector sketch follows this list).
  • Design, develop and deploy scalable and efficient data pipelines.
  • Analyze and assemble large, complex datasets that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements. For example: automating manual processes, optimizing data delivery, re-designing data platform infrastructure for greater scalability.
  • Build required infrastructure for optimal extraction, loading, and transformation (ELT) of data from various data sources using AWS and Snowflake leveraging Python or SQL technologies.
  • Monitor cloud-based systems and components for availability, performance, reliability, security and efficiency
  • Create and configure appropriate cloud resources to meet the needs of the end users.
  • As needed, document topology, processes, and solution architecture.
  • Share your passion for staying on top of tech trends, experimenting with and learning new technologies
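A minimal sketch of extracting data from Snowflake with the Python connector, as referenced above; the account, credentials, and query are placeholders (install with `pip install snowflake-connector-python`):

```python
"""Minimal Snowflake extraction sketch using the Python connector."""
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345",         # placeholder account locator
    user="ETL_USER",           # placeholder user
    password="***",            # placeholder secret; use a vault in practice
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT order_id, amount FROM orders LIMIT 10")  # placeholder query
    for row in cur.fetchall():
        print(row)
finally:
    conn.close()
```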

 

Qualifications & Experience

Qualification & Experience Requirements:

  • Bachelor's degree in computer science, computer engineering, or a related field.
  • 2-8 years of experience working with Snowflake
  • 2+ years of experience with AWS services.
  • Candidate should be able to write stored procedures and functions in Snowflake.
  • At least 2 years' experience as a Snowflake developer.
  • Strong SQL knowledge.
  • Data ingestion into Snowflake using Snowflake procedures (see the sketch after this list).
  • ETL experience is a must (any tool).
  • Candidate should be aware of Snowflake architecture.
  • Worked on a migration project.
  • Data warehousing concepts (optional).
  • Experience with cloud data storage and compute components, including Lambda functions, EC2 instances, and containers.
  • Experience with data pipeline and workflow management tools: Airflow, etc.
  • Experience cleaning, testing, and evaluating data quality from a wide variety of ingestible data sources
  • Experience working with Linux and UNIX environments.
  • Experience with profiling data, with and without data definition documentation
  • Familiar with Git
  • Familiar with issue tracking systems like JIRA (Project Management Tool) or Trello.
  • Experience working in an agile environment.
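As a rough illustration of the stored-procedure and ingestion items above: loading staged files with COPY INTO and then calling a stored procedure through the same connector. All object names (RAW.ORDERS, @ORDERS_STAGE, LOAD_ORDERS) and connection details are hypothetical:

```python
"""Sketch: ingest staged files with COPY INTO, then call a stored procedure."""
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345", user="ETL_USER", password="***",   # placeholders
    warehouse="COMPUTE_WH", database="ANALYTICS",
)
try:
    cur = conn.cursor()
    # Bulk-load staged CSV files into a landing table (ingestion).
    cur.execute("COPY INTO RAW.ORDERS FROM @ORDERS_STAGE FILE_FORMAT = (TYPE = CSV)")
    # Invoke a stored procedure that wraps downstream transformation logic.
    cur.execute("CALL LOAD_ORDERS()")
    print(cur.fetchone())
finally:
    conn.close()
```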

Desired Skills:

  • Experience in Snowflake. Must be willing to be Snowflake certified in the first 3 months of employment.
  • Experience with a stream-processing system: Snowpipe
  • Working knowledge of AWS or Azure
  • Experience in migrating from on-prem to cloud systems
Graasai
Posted by Vineet A
Pune
3 - 7 yrs
₹10L - ₹30L / yr
Spark
Hadoop
Big Data
Data engineering
PySpark
+9 more

Graas uses predictive AI to turbo-charge growth for eCommerce businesses. We are “Growth-as-a-Service”. Graas integrates traditional data silos and applies a machine-learning AI engine, acting as an in-house data scientist to predict trends and give real-time insights and actionable recommendations for brands. The platform can also turn insights into action by seamlessly executing these recommendations across marketplace storefronts, brand.coms, social and conversational commerce, performance marketing, inventory management, warehousing, and last-mile logistics - all of which impact a brand’s bottom line, driving profitable growth.


Roles & Responsibilities:

Work on implementation of real-time and batch data pipelines for disparate data sources (a toy batch-transform sketch follows this list).

  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS technologies.
  • Build and maintain an analytics layer that utilizes the underlying data to generate dashboards and provide actionable insights.
  • Identify improvement areas in the current data system and implement optimizations.
  • Work on specific areas of data governance including metadata management and data quality management.
  • Participate in discussions with Product Management and Business stakeholders to understand functional requirements and interact with other cross-functional teams as needed to develop, test, and release features.
  • Develop Proof-of-Concepts to validate new technology solutions or advancements.
  • Work in an Agile Scrum team and help with planning, scoping and creation of technical solutions for the new product capabilities, through to continuous delivery to production.
  • Work on building intelligent systems using various AI/ML algorithms. 
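A toy batch-transform sketch of the pipeline work described above; the file names and columns (order_date, amount) are invented purely for illustration:

```python
"""Toy batch transform: read raw orders, aggregate daily revenue, write output.

File names and column names are invented for illustration.
"""
import pandas as pd

raw = pd.read_csv("raw_orders.csv")                 # hypothetical extract
raw["order_date"] = pd.to_datetime(raw["order_date"])

daily = (
    raw.groupby(raw["order_date"].dt.date)["amount"]
       .sum()
       .reset_index(name="daily_revenue")
)
daily.to_csv("daily_revenue.csv", index=False)      # hypothetical load target
```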

 

Desired Experience/Skill:

 

  • Must have worked on Analytics Applications involving Data Lakes, Data Warehouses and Reporting Implementations.
  • Experience with private and public cloud architectures with pros/cons.
  • Ability to write robust code in Python and SQL for data processing. Experience in libraries such as Pandas is a must; knowledge of one of the frameworks such as Django or Flask is a plus.
  • Experience in implementing data processing pipelines using AWS services: Kinesis, Lambda, Redshift/Snowflake, RDS (a minimal Lambda handler sketch follows this list).
  • Knowledge of Kafka and Redis is preferred.
  • Experience in designing and implementing real-time and batch pipelines. Knowledge of Airflow is preferred.
  • Familiarity with machine learning frameworks (like Keras or PyTorch) and libraries (like scikit-learn)
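For the Kinesis/Lambda item above, a minimal (hypothetical) handler that decodes Kinesis records and writes them to S3; the bucket name and the assumption that each payload is JSON are illustrative, and batching/error handling are omitted:

```python
"""Minimal AWS Lambda handler sketch for a Kinesis-triggered function."""
import base64
import json

import boto3

s3 = boto3.client("s3")
BUCKET = "example-events-bucket"   # hypothetical bucket


def handler(event, context):
    for record in event["Records"]:
        payload = base64.b64decode(record["kinesis"]["data"])   # Kinesis data is base64
        doc = json.loads(payload)                               # assumes JSON payloads
        key = f"events/{record['kinesis']['sequenceNumber']}.json"
        s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(doc).encode())
    return {"processed": len(event["Records"])}
```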
Wowinfobiz
Posted by Nargis Mapari
Mumbai
4 - 6 yrs
₹10L - ₹12L / yr
PL/SQL
SQL Azure
Amazon Web Services (AWS)
Oracle
SQL
+4 more

PL/SQL Developer

Experience: 4 to 6 years

Skills: MS SQL Server and Oracle, AWS or Azure


• Experience in setting up RDS service in cloud technologies such as AWS or Azure
• Strong proficiency with SQL and its variation among popular databases
• Should be well-versed in writing stored procedures, functions, and packages, and using collections (a small Python call sketch follows this list)
• Skilled at optimizing large, complicated SQL statements
• Should have worked in migration projects
• Should have worked on creating reports
• Should be able to distinguish between normalized and de-normalized data modelling designs and use cases
• Knowledge of best practices when dealing with relational databases
• Capable of troubleshooting common database issues
• Familiar with tools that can aid with profiling server resource usage and optimizing it
• Proficient understanding of code versioning tools such as Git and SVN
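Although this role is PL/SQL-centric, here is a minimal Python sketch of exercising a (hypothetical) stored procedure with the python-oracledb driver, the kind of thing that helps when testing or troubleshooting; the DSN, credentials, procedure name, and argument are placeholders:

```python
"""Sketch: calling a hypothetical PL/SQL stored procedure from Python."""
import oracledb

conn = oracledb.connect(user="APP_USER", password="***", dsn="dbhost/ORCLPDB1")  # placeholders
try:
    cur = conn.cursor()
    cur.callproc("REFRESH_SALES_MV", ["2023-01"])   # hypothetical procedure and argument
    conn.commit()
finally:
    conn.close()
```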


SaaS-based product company
Bengaluru (Bangalore), Gurugram
4 - 8 yrs
₹6L - ₹15L / yr
NOC
Computer Networking
Network operations
Amazon Web Services (AWS)
Windows Azure
+3 more
Hi

We have an opportunity for a Lead Operations Engineer role with our client at Bangalore /Gurgaon. Sharing the JD for your reference. Please revert if you would be interested in this opportunity and we can connect accordingly.

JOB DETAILS

Shift timing: 

9.00 AM - 6.00 PM / 11.00 AM - 8.00 PM / 2.00 PM - 11.00 PM / 7.00 PM - 3.00 AM IST (night shift allowance will be provided)

Position

Lead Operations Engineer

Location

Bangalore/ Gurgaon

About Our client

Who we are:

At a time when consumers are connected and empowered like never before, our client is helping the world's largest brands provide amazing experiences at every turn. It offers a set of powerful social capabilities that allow our clients to reach, engage, and listen to customers across 24 social channels. We empower entire organizations to work together across social, marketing, advertising, research, and customer care to manage customer experience at scale. Most exciting, our client works with 50% of the Fortune 500 and nine of the world's 10 most valued brands, including McDonald's, Nestle, Nike, P&G, Shell, Samsung, and Visa.

What You'll Do

As a Lead Operations Engineer at our client, you should be passionate about working on new technologies and high-profile projects, and be motivated to deliver solutions on an aggressive schedule.

Candidates from product-based companies only.

1. 5-7 years of exposure to and working knowledge of data centers, on-premise or on AWS/Azure/GCP.
2. Working experience with Jenkins, Ansible, Git, releases & deployments.
3. Working experience with ELK, Mongo, Kafka, Kubernetes.
4. Implement and operate SaaS environments hosting multiple applications and provide production support.
5. Contribute to automation and provisioning of environments.
6. Strong Linux systems administration skills with RHCE/CentOS.
7. Scripting knowledge in one of the following: Python/Bash/Perl (a small Python example follows this list).
8. Good knowledge of Gradle, Maven, etc.
9. Should have knowledge of service monitoring via Nagios, Sensu, etc.
10. Good to have: knowledge of setting up and deploying application servers.
11. Mentoring team members.
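As a tiny illustration of the scripting and monitoring items above (not any particular tool's API), a disk-usage check in Python; the mount points and 85% threshold are arbitrary:

```python
"""Tiny operational check: warn when disk usage crosses a threshold."""
import shutil

MOUNTS = ["/", "/var", "/data"]   # hypothetical mount points
THRESHOLD = 0.85                  # warn above 85% used

for mount in MOUNTS:
    try:
        usage = shutil.disk_usage(mount)
    except FileNotFoundError:
        continue                  # skip mounts that don't exist on this host
    used_fraction = usage.used / usage.total
    status = "WARN" if used_fraction > THRESHOLD else "OK"
    print(f"{status} {mount}: {used_fraction:.0%} used")
```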
Angel One

Posted by Shriya Tak
Remote only
3 - 5 yrs
₹5L - ₹10L / yr
SQL
SQL server
PL/SQL
Microsoft SQL Server Data Tools
SQL Developer
+3 more
  • You would be responsible for developing SQL databases, automating business reports, and developing business dashboards using BI tools.
  • Development of high-quality database solutions
  • Create complex functions, scripts, stored procedures and triggers to support application development.
  • Develop, implement and optimize stored procedures and functions using T-SQL
  • Review and interpret ongoing business report requirements
  • Build appropriate and useful reporting deliverables
  • Analyze existing SQL queries for performance improvements
  • Fix any issues related to database performance and provide corrective measures.
  • MIS automation (a small report-export sketch follows this list)
  • Provide timely scheduled management reporting
  • Create business dashboards using the Tableau BI tool
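A small sketch of the report-automation work above: pulling a result set from SQL Server with pyodbc and exporting it to Excel with pandas. The connection string, table, date filter, and output file are hypothetical (requires pyodbc, pandas, and openpyxl):

```python
"""Sketch: automate a recurring MIS report from SQL Server into Excel."""
import pandas as pd
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sql-host;DATABASE=ReportsDB;UID=report_user;PWD=***"   # placeholders
)

conn = pyodbc.connect(CONN_STR)
try:
    # Placeholder query; a stored procedure or view would work the same way.
    df = pd.read_sql(
        "SELECT * FROM dbo.DailySales WHERE report_date = ?",
        conn,
        params=["2023-01-25"],
    )
finally:
    conn.close()

df.to_excel("daily_sales_report.xlsx", index=False)   # scheduled output file
```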


Requirement / Desired Skills:

  • 5+ years of business analyst / SQL developer experience in fintech, internet, consulting, or similar industries
  • Excellent understanding of T-SQL programming
  • Excellent understanding of Microsoft SQL Server 
  • Good knowledge of HTML and JavaScript 
  • SQL Server Reporting Services and SQL Server Analysis Services
  • Knowledge of any of the BI Tool (Tableau, PowerBI or Qlikview)
TartanHQ Solutions Private Limited
Posted by Prabhat Shobha
Bengaluru (Bangalore)
2 - 4 yrs
₹9L - ₹15L / yr
Machine Learning (ML)
Data Science
Natural Language Processing (NLP)
Computer Vision
Python
+4 more

Key deliverables for the Data Science Engineer would be to help us discover the information hidden in vast amounts of data, and help us make smarter decisions to deliver even better products. Your primary focus will be on applying data mining techniques, doing statistical analysis, and building high-quality prediction systems integrated with our products.

What will you do?

  • You will be building and deploying ML models to solve specific business problems related to NLP, computer vision, and fraud detection.
  • You will be constantly assessing and improving the model using techniques like transfer learning
  • You will identify valuable data sources and automate collection processes along with undertaking pre-processing of structured and unstructured data
  • You will own the complete ML pipeline - data gathering/labeling, cleaning, storage, modeling, training/testing, and deployment.
  • Assessing the effectiveness and accuracy of new data sources and data gathering techniques.
  • Building predictive models and machine-learning algorithms to apply to data sets.
  • Coordinate with different functional teams to implement models and monitor outcomes.
  • Presenting information using data visualization techniques and proposing solutions and strategies to business challenges


We would love to hear from you if :

  • You have 2+ years of experience as a software engineer at a SaaS or technology company
  • Demonstrable hands-on programming experience with Python/R Data Science Stack
  • Ability to design and implement workflows of Linear and Logistic Regression, Ensemble Models (Random Forest, Boosting) using R/Python (a small Python sketch follows this list)
  • Familiarity with Big Data Platforms (Databricks, Hadoop, Hive), AWS services (SageMaker, IAM, S3, Lambda functions, Redshift, Elasticsearch)
  • Experience in Probability and Statistics, ability to use ideas of Data Distributions, Hypothesis Testing and other Statistical Tests.
  • Demonstrable competency in Data Visualisation using the Python/R Data Science Stack.
  • Preferably experienced in web crawling and data scraping
  • Strong experience in NLP. Worked on libraries such as NLTK, spaCy, Pattern, Gensim, etc.
  • Experience with text mining, pattern matching and fuzzy matching
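Tying the NLP and logistic-regression items above together, a minimal scikit-learn sketch on made-up data; the example texts and labels exist only to show the pipeline shape:

```python
"""Minimal text-classification sketch: TF-IDF features + logistic regression."""
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "refund not received for order 123",
    "please verify your account immediately",   # made-up suspicious message
    "great service, thanks",
    "click this link to claim your prize",      # made-up suspicious message
]
labels = [0, 1, 0, 1]   # 1 = flagged, 0 = normal (invented labels)

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

print(model.predict(["urgent: confirm your password"]))   # toy inference
```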

Why Tartan?
  • Brand new Macbook
  • Stock Options
  • Health Insurance
  • Unlimited Sick Leaves
  • Passion Fund (Invest in yourself or your passion project)
  • Wind Down
SaaS-based Tech Company
Noida
3 - 9 yrs
₹15L - ₹18L / yr
Amazon Web Services (AWS)
AWS CloudFormation
DevOps
Docker
Hello

Role: Infra Management Tech Consultant

 

This is for you, if you:

  • Love technology and have an innate interest in coding and building products.
  • Aren’t afraid of taking ownership and accountability. You love challenges and won’t stop till you have found a solution.
  • Possess knowledge of Software development industry best practices.
  • Have hands-on experience and expertise on the technical competencies needed for the job, so that you can work independently.
  • Are passionate about the travel/hospitality space and can understand the nuances of hotel operations.
  • Know how to have fun and smile often

 

Certifications Required

  • AWS DevOps Engineer (professional) - equivalent or higher
  • AWS Solution Architect (professional) - equivalent or higher
  • AWS Developer (associate) - equivalent or higher

 

What will you do?

  • Responsible for managing and overseeing infrastructure for all engineering activities.
  • Creation & maintenance of Dev, QA, UAT, CS, Sales, Prod (cloud + private).
  • Backup & Restore underlying resources for failover.
  • Manage connectivity, access & security for all environments.
  • Upgrade environment for OS / software / service updates, certificates & patches.
  • Upgrade environments for architecture upgrades.
  • Upgrade environments for cost optimisations.
  • Perform Release activity for all environments.
  • Monitor (& set up alerts for) underlying resources for degradation of services (a minimal boto3 sketch follows this list).
  • Automate as many infra activities as possible.
  • Ensure Apps are healthy - log rotation, alarms, failover, scaling, recovery.
  • Ensure DBs are healthy - size, indexing, fragmentation, failover, scaling, recovery.
  • Assist in debugging production issues, engage with Development team to identify & apply long term fixes.
  • Explore source code, databases, logs, and traces to find or create solutions.
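A minimal boto3 sketch of the resource monitoring/automation described above; the region and tag filter are placeholders, credentials are assumed to come from the environment, and alerting is left out:

```python
"""Minimal boto3 sketch: flag EC2 instances that are not in the running state."""
import boto3

ec2 = boto3.client("ec2", region_name="ap-south-1")   # placeholder region

resp = ec2.describe_instances(
    Filters=[{"Name": "tag:Environment", "Values": ["prod"]}]   # hypothetical tag filter
)
for reservation in resp["Reservations"]:
    for instance in reservation["Instances"]:
        state = instance["State"]["Name"]
        if state != "running":
            print(f"ALERT: {instance['InstanceId']} is {state}")
```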

 

Technical Competencies you’ll possess:

  • B.E./B.Tech/MCA or equivalent
  • 3+ years of meaningful work experience in DevOps handling complex services
  • AWS DevOps Engineer/Architect/Developer certification
  • Expertise with AWS services including but not limited to EC2, ECS, S3, RDS, Lambda, VPC, OpsWorks, CloudFront, Route53, CodeDeploy, SQS, SNS.
  • Hands-on experience in maintaining production databases - including creating queries for identifying bottlenecks, creating/maintaining indexes where required, defragmenting DBs
  • Hands-on experience (and strong understanding) of Linux & Windows based OS
  • Expertise with Docker and related tools
  • Hands-on experience with infrastructure-as-code tools like AWS CloudFormation/Terraform and configuration management tools like Puppet or Chef
  • Strong grasp of modern stack protocols/technologies like:
    • Headers, Caching
    • IP/TCP/HTTP(S), WebSockets, SSL/TLS
    • CDNs, DNS, proxies
  • Expertise in setting up and maintaining a modern stack (on AWS & cloud), like:
    • Version control systems like TFS, SVN, GitLab servers, etc.
    • Application & proxy servers like Nginx, HAProxy, IIS, etc.
    • Database servers like MSSQL, MySQL, Postgres, Mongo, Redis, Elasticsearch, etc.
    • Monitoring tools like Grafana, Zabbix, Influx, Prometheus, etc.
  • Experience in Python coding preferred
  • Good understanding and experience in continuous integration/continuous deployment tools

Regards
Team Merito
Streetmark
Agency job
via STREETMARK Info Solutions by Mohan Guttula
Remote, Bengaluru (Bangalore), Chennai
3 - 9 yrs
₹3L - ₹20L / yr
SCCM
PL/SQL
APPV
Stani's Python Editor
AWS Simple Notification Service (SNS)
+3 more

Hi All,

We are hiring a Data Engineer for one of our clients for the Bangalore & Chennai locations.


Strong knowledge of SCCM, App-V, and Intune infrastructure.

PowerShell/VBScript/Python scripting (a small registry-read sketch in Python appears below)

Windows Installer

Knowledge of Windows 10 registry

Application Repackaging

Application sequencing with App-V

Deploying and troubleshooting applications, packages, and Task Sequences.

Security patch deployment and remediation

Windows operating system patching and defender updates
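Since the role lists Python alongside Windows 10 registry knowledge, a minimal read-only registry sketch using the standard-library winreg module (Windows-only); the key path and value names are common examples used purely for illustration:

```python
"""Minimal read-only Windows registry sketch (stdlib winreg, Windows-only)."""
import winreg

KEY_PATH = r"SOFTWARE\Microsoft\Windows NT\CurrentVersion"   # illustrative key

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
    product, _ = winreg.QueryValueEx(key, "ProductName")
    build, _ = winreg.QueryValueEx(key, "CurrentBuild")
    print(f"{product} (build {build})")
```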

 

Thanks,
Mohan.G

Catalyst IQ

Posted by Sidharth Maholia
Bengaluru (Bangalore)
7 - 10 yrs
₹20L - ₹30L / yr
Business Planning
Business requirements
Business operations
Business Intelligence (BI)
Business growth
Responsibilities
• Identify key metrics for business growth
• Map various business processes, systems to drive key metrics
• Design and implement business plans, processes to drive business growth
• Set comprehensive goals for performance and growth
• Oversee daily operations of the company and the work of executives (IT, Marketing, Sales, Finance etc.), with respect to implementation of processes
• Evaluate performance by analyzing and interpreting data and metrics
• Write and submit reports to the CEO in all matters of importance

Requirements
• Proven experience as a growth leader in identifying key metrics and underlying process for business growth
• Demonstrable competency in driving process implementation
• Understanding of business functions such as HR, Finance, marketing etc.
• Working knowledge of data analysis and performance/operation metrics
• Working knowledge of budgeting, sales, business development, and strategic planning.
• Agri Industry background is preferable
• Outstanding organizational and leadership abilities
• Excellent interpersonal and public speaking skills
• Aptitude in decision-making and problem-solving
• MBA from a Tier 1/2 University

Personal Attributes
• Proven leadership ability.
• Ability to set and manage priorities judiciously.
• Excellent written and oral communication skills.
• Excellent interpersonal skills.
• Ability to articulate ideas to both technical and non-technical audiences.
• Exceptionally self-motivated and directed.
• Keen attention to detail.
• Superior analytical, evaluative, and problem-solving abilities.
• Exceptional service orientation.
• Ability to motivate in a team-oriented, collaborative environment.
Read more