MicroStrategy Admin

Latent Bridge Pvt Ltd
Posted by Mansoor Khan
3 - 7 yrs
₹5L - ₹20L / yr
Remote only
Skills
MicroStrategy administration
Amazon Web Services (AWS)
Business Intelligence (BI)
MSTR

Familiar with the MicroStrategy architecture; Admin certification preferred

· Familiar with administrative functions using Object Manager and Command Manager, installation/configuration of MSTR in a clustered architecture, and applying patches and hot-fixes

· Monitor and manage existing Business Intelligence development/production systems

· MicroStrategy installation, upgrade and administration on Windows and Linux platforms

· Ability to support and administer multi-tenant MicroStrategy infrastructure, including server security, troubleshooting and general system maintenance.

· Analyze application and system logs as part of troubleshooting and root-cause analysis

· Work on operations such as deploying and managing packages, user management, schedule management, governing-settings best practices, and database instance and security configuration.

· Monitor and report on report performance, and investigate solutions to improve it.

· Continuously improve the platform through tuning, optimization, governance, automation, and troubleshooting.

· Provide support for the platform, report execution and implementation, user community and data investigations.

· Identify improvement areas in environment hosting and upgrade processes.

· Identify automation opportunities and participate in automation implementations

· Provide on-call support for Business Intelligence issues

· Experience working on MSTR 2021, including knowledge of Enterprise Manager and newer features like Platform Analytics, Hyper Intelligence, Collaboration, MSTR Library, etc.

· Familiar with AWS and Linux scripting

· Knowledge of MSTR Mobile

· Knowledge of capacity planning and system scaling needs


About Latent Bridge Pvt Ltd

Founded: 2019
Type:
Size: 100-1000
Stage: Bootstrapped
About
The ALBAI™ platform delivers AI-driven intelligent automation solutions in a flexible, scalable and easy-to-implement manner with zero capital costs, giving clients a one-stop shop for all IA software needs in a vendor-agnostic manner.
Connect with the team
Danish Hasan
Shweta Kapoor
Priyank Sain
Mansoor Khan
Palak Pal

Similar jobs

Databook
Posted by Nikhil Mohite
Mumbai
1 - 3 yrs
Up to ₹20L / yr (varies)
Data engineering
Python
Apache Kafka
Spark
Amazon Web Services (AWS)
+1 more


About Databook:

Great salespeople let their customers’ strategies do the talking.

 

Databook’s award-winning Strategic Relationship Management (SRM) platform uses advanced AI and NLP to empower the world’s largest B2B sales teams to create, manage, and maintain strategic relationships at scale. The platform ingests and interprets billions of financial and market data signals to generate actionable sales strategies that connect the seller’s solutions to a buyer’s financial pain and urgency.

 

The Opportunity

We're seeking Junior Engineers to support and develop Databook’s capabilities. Working closely with our seasoned engineers, you'll contribute to crafting new features and ensuring our platform's reliability. If you're eager to play a part in building the future of customer intelligence, with a keen eye for quality, we'd love to meet you!

 

Specifically, you'll

- Participate in various stages of the engineering lifecycle alongside our experienced engineers.

- Assist in maintaining and enhancing features of the Databook platform.

- Collaborate with various teams to comprehend requirements and aid in implementing technology solutions.

 

Please note: As you progress and grow with us, you might be introduced to on-call rotations to handle any platform challenges.

 

Working Arrangements:

- This position offers a hybrid work mode, allowing employees to work both remotely and in-office as mutually agreed upon.

 

What we're looking for

- 1-2+ years of experience as a Data Engineer

- Bachelor's degree in Engineering

- Willingness to work across different time zones

- Ability to work independently

- Knowledge of cloud (AWS or Azure)

- Exposure to distributed systems such as Spark, Flink or Kafka

- Fundamental knowledge of data modeling and optimizations

- At least one year of experience using Python as a Software Engineer

- Knowledge of SQL (Postgres) databases would be beneficial

- Experience with building analytics dashboards

- Familiarity with RESTful APIs and/or GraphQL is welcomed

- Hands-on experience with NumPy, Pandas and spaCy would be a plus

- Exposure to or working experience with GenAI (LLMs in general) and LLMOps would be a plus

- Highly fluent in both spoken and written English

 

Ideal candidates will also have:

- Self-motivation and great organizational skills.

- An eye for small and subtle details.

- Willingness to learn and adapt in a rapidly changing environment.

- Excellent written and oral communication skills.

 

Join us and enjoy these perks!

- Competitive salary with bonus

- Medical insurance coverage

- 5 weeks leave plus public holidays

- Employee referral bonus program

- Annual learning stipend to spend on books, courses or other training materials that help you develop skills relevant to your role or professional development

- Complimentary subscription to Masterclass

Acuity Knowledge Partners
Posted by Gangadhar S
Bengaluru (Bangalore)
4 - 9 yrs
₹16L - ₹40L / yr
Python
Amazon Web Services (AWS)
CI/CD
MongoDB
MLOps
+1 more

Job Responsibilities:

1. Develop/debug applications using Python.

2. Improve code quality and code coverage for existing or new programs.

3. Deploy and integrate Machine Learning models.

4. Test and validate the deployments.

5. Support MLOps functions.


Technical Skills

1. Graduate in Engineering or Technology with strong academic credentials

2. 4 to 8 years of experience as a Python developer.

3. Excellent understanding of SDLC processes

4. Strong knowledge of Unit testing, code quality improvement

5. Cloud-based deployment and integration of applications/microservices.

6. Experience with NoSQL databases, such as MongoDB, Cassandra

7. Strong applied statistics skills

8. Knowledge of creating CI/CD pipelines and touchless deployment.

9. Knowledge of APIs and Data Engineering techniques.

10. AWS

11. Knowledge of Machine Learning and Large Language Models.


Nice to Have

1. Exposure to financial research domain

2. Experience with JIRA, Confluence

3. Understanding of scrum and Agile methodologies

4. Experience with data visualization tools such as Grafana, ggplot, etc.

Mumbai
5 - 14 yrs
₹50L - ₹70L / yr
Machine Learning (ML)
Data Science
Natural Language Processing (NLP)
Computer Vision
kubeflow
+8 more

Responsibilities:

  • Review data science models; carry out code refactoring and optimization, containerization, deployment, versioning, and monitoring of model quality.
  • Design and implement cloud solutions, build MLOps on the cloud (preferably AWS)
  • Work with workflow orchestration tools like Kubeflow, Airflow, Argo, or similar tools
  • Data science model testing, validation, and test automation.
  • Communicate with a team of data scientists, data engineers, and architects, and document the processes.


Eligibility:

  • Rich hands-on experience in writing object-oriented code using Python
  • Min 3 years of MLOps experience (Including model versioning, model and data lineage, monitoring, model hosting and deployment, scalability, orchestration, continuous learning, and Automated pipelines)
  • Understanding of Data Structures, Data Systems, and software architecture
  • Experience in using MLOps frameworks like Kubeflow, MLflow, and Airflow pipelines for building, deploying, and managing multi-step ML workflows based on Docker containers and Kubernetes (see the illustrative sketch after this list).
  • Exposure to deep learning approaches and modeling frameworks (PyTorch, TensorFlow, Keras, etc.)
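
Purely as an illustration of the kind of multi-step MLOps workflow the eligibility list above describes, and not part of the original listing, here is a minimal Python sketch using MLflow to track parameters, a quality metric and a versioned model. The experiment name, model choice, parameters and registry name are hypothetical placeholders.

    # Minimal, illustrative MLflow sketch (hypothetical names and values):
    # train a small model, log parameters/metrics, and register a model version.
    import mlflow
    import mlflow.sklearn
    from sklearn.datasets import load_diabetes
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.metrics import mean_squared_error
    from sklearn.model_selection import train_test_split

    # A database-backed store is needed for the model registry; sqlite works locally.
    mlflow.set_tracking_uri("sqlite:///mlflow.db")
    mlflow.set_experiment("demo-mlops-sketch")  # hypothetical experiment name

    X, y = load_diabetes(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

    with mlflow.start_run():
        params = {"n_estimators": 100, "max_depth": 5}
        model = RandomForestRegressor(**params).fit(X_train, y_train)

        mlflow.log_params(params)  # record the configuration used for this version
        mse = mean_squared_error(y_test, model.predict(X_test))
        mlflow.log_metric("mse", mse)  # quality metric tracked per run

        # Log the model artifact and register it so deployments can pin a version.
        mlflow.sklearn.log_model(model, "model", registered_model_name="demo-regressor")

A real pipeline would wrap steps like these in Kubeflow or Airflow tasks and containerise them, but the tracking and registration calls stay essentially the same.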
CoreStack
Posted by Maria Godslin
Chennai
3 - 6 yrs
₹3L - ₹7L / yr
API
SoapUI
Selenium
Amazon Web Services (AWS)
Windows Azure
+2 more

CoreStack, an AI-powered multi-cloud governance solution, empowers enterprises to unleash the power of the cloud on their terms by helping them rapidly achieve continuous and autonomous cloud governance at scale. CoreStack enables enterprises to realize outcomes across FinOps, SecOps and CloudOps such as a 40% decrease in cloud costs and a 50% increase in operational efficiencies by governing operations, security, cost, access, and resources. CoreStack also assures 100% compliance with standards such as ISO, FedRAMP, NIST, HIPAA, PCI-DSS, AWS CIS & Well-Architected Framework (WAF). CoreStack works with many large global customers across multiple industries including Financial Services, Healthcare, Retail, Education, Telecommunications, Technology and Government. 

The company is backed by industry-leading venture investors. CoreStack is a recent recipient of the 2021 Gold Stevie American Business Awards in the Cloud Infrastructure category and the 2021 Gold Globee Winner of the Most Innovative Company of the Year in IT Cloud/SaaS. In addition, CoreStack won the 2021 Best New Products American Business Award in Cloud Governance as well as Golden Bridge Awards for Cloud Computing/SaaS Innovation and Cloud Security Innovation. CoreStack was recognized as IDC Innovator in Cloud Management Solutions and in the Gartner Magic Quadrant for Cloud Management Platforms in 2020. The Company is a three-time TiE50 Winner and an Emerge 50 League-10 NASSCOM award recipient in Enterprise Software. CoreStack is a Google Cloud Build Partner, Microsoft Azure Gold & Co-Sell Partner, and Amazon AWS Advanced Technology Competency Partner.

 

Responsibilities:

  • Part of a product team responsible for the quality of the product used by global customers
  • Drive the automation efforts to reduce the time taken to identify issues
  • Contribute to process improvement across the entire product development life cycle.
  • Should be able to capture user behavior and identify gaps in UX flows.
  • Responsible for the performance and usability of the product.

Requirements

  • Minimum 4+ years of experience in the testing domain.
  • Should be well versed with software testing methodologies and techniques
  • Should have experience in AWS, Azure or Google Cloud
  • Should have experience in manual testing and identifying critical scenarios.
  • Should have experience in database testing
  • Should have experience in test management tools.
  • Should have experience in performance testing, load testing and basic knowledge of security testing.
  • API testing using SoapUI is an added advantage.
  • Selenium with Java is an added advantage.
  • CI/CD using Jenkins is an added advantage.
  • Prior work experience with the cloud domain and cloud-based products is a plus

CoreStack Offers

  • Competitive salary
  • Competitive benefit package with appreciable equity
  • Exciting, fast-paced and entrepreneurial culture
  • Health insurance and other company benefits
Bengaluru (Bangalore)
4 - 8 yrs
₹20L - ₹25L / yr
Relational Database (RDBMS)
PostgreSQL
MySQL
Python
Spark
+6 more

What is the role?

You will be responsible for developing and designing front-end web architecture, ensuring the responsiveness of applications, and working alongside graphic designers on web design features, among other duties. You will also be responsible for the functional/technical track of the project.

Key Responsibilities

  • Develop and automate large-scale, high-performance data processing systems (batch and/or streaming).
  • Build high-quality software engineering practices towards building data infrastructure and pipelines at scale.
  • Lead data engineering projects to ensure pipelines are reliable, efficient, testable, & maintainable
  • Optimize performance to meet high throughput and scale

What are we looking for?

  • 4+ years of relevant industry experience.
  • Working with data at the terabyte scale.
  • Experience designing, building and operating robust distributed systems.
  • Experience designing and deploying high throughput and low latency systems with reliable monitoring and logging practices.
  • Building and leading teams.
  • Working knowledge of relational databases like PostgreSQL/MySQL.
  • Experience with Python / Spark / Kafka / Celery
  • Experience working with OLTP and OLAP systems
  • Excellent communication skills, both written and verbal.
  • Experience working in the cloud, e.g., AWS, Azure or GCP

Whom will you work with?

You will work with a top-notch tech team, working closely with the architect and engineering head.

What can you look for?

A wholesome opportunity in a fast-paced environment that will enable you to juggle between concepts yet maintain the quality of content, interact and share your ideas, and have loads of learning while at work. You will work with a team of highly talented young professionals and enjoy the benefits of being at this company.

We are

We strive to make selling fun with our SaaS incentive gamification product. Company is the #1 gamification software that automates and digitizes Sales Contests and Commission Programs. With game-like elements, rewards, recognitions, and complete access to relevant information, Company turbocharges an entire salesforce. Company also empowers Sales Managers with easy-to-publish game templates, leaderboards, and analytics to help accelerate performances and sustain growth.

We are a fun and high-energy team, with people from diverse backgrounds - united under the passion of getting things done. Rest assured that you shall get complete autonomy in your tasks and ample opportunities to develop your strengths.

Way forward

If you find this role exciting and want to join us in Bangalore, India, then apply by clicking below. Provide your details and upload your resume. All received resumes will be screened; shortlisted candidates will be requested to join for a discussion, and on mutual alignment and agreement we will proceed with hiring.

 
TeamExtn
Posted by Jhalak Doshi
Remote, Mumbai
4 - 6 yrs
₹1L - ₹15L / yr
Scala
Java
Akka
MySQL
Google Cloud Platform (GCP)
+2 more

Job Description:

TeamExtn is looking for a passionate Senior Scala Engineer. You will be expected to build pragmatic solutions for mission-critical initiatives. If you know your stuff, see the beauty in code, have depth and breadth of knowledge, advocate best practices, and love to work with distributed systems, then this is an ideal position for you.

As a core member of our Special Projects team, you will work on various new projects in a startup-like environment. These projects may include such things as building new APIs (REST/GraphQL/gRPC) for new products, integrating new products with core Carvana services, building highly scalable back end processing systems in Scala and integrating with systems backed by Machine Learning. You will use cutting edge functional Scala libraries such as ZIO. You will have the opportunity to work closely with our Product, Experience and Software Engineering teams to deliver impact.

Responsibilities:

  • Build highly scalable APIs and back end processing systems for new products
  • Contribute to the full software development lifecycle, from design and development to testing and operating in production
  • Communicate effectively with engineers, product managers and data scientists
  • Drive scalability and performance within our distributed AI platform

Skills And Experience:

  • 4+ years of experience with Scala, Java or another functional language
  • Experience with Akka and Lightbend stack
  • Expert with PostgreSQL, MySQL or MS SQL
  • Experience in architecting, developing, deploying and operating large scale distributed systems and actor systems
  • Experience with cloud APIs (e.g., GCP, AWS, Azure)
  • Messaging systems such as GCP Pub/Sub, RabbitMQ, Kafka
  • Strong foundation in algorithms and data structures and their real-world use cases.
  • Solid understanding of computer systems and networks
  • Production quality coding standards and patterns

 

BONUS SKILLS:

  • Experience with functional programming in Scala
  • Knowledge of ZIO and related ecosystem
  • Experience with functional database libraries in Scala (Quill preferred)
  • Kubernetes and Docker
  • Elasticsearch
  • Typescript, React and frontend UI development experience
  • gRPC, GraphQL
The Solar Labs
Posted by Ruchika Prakash
Bengaluru (Bangalore)
2 - 3 yrs
₹12L - ₹14L / yr
ThreeJs (Three.js)
Unity
Java
Data Structures
Algorithms
+6 more
About the Company:

The Solar Labs was founded by IIT alumni in 2017 to accelerate solar adoption in the world. Our products empower the solar industry to help it succeed. We develop software that helps solar installers and developers in designing more optimized solar PV systems, increase energy yield per panel installed, reduce cost of installations and create quotations and reports for clients within 20 minutes. The software has been used to estimate 1200 MW+ of solar capacity across India and serves some of the largest companies in the world including Tata Power, Adani Solar, Renew Power and hundreds of MSMEs.

When we succeed, the solar industry wins, and the world wins.

About the Product :
It is 3D simulation software used to replicate rooftops/commercial sites, place solar panels, and estimate solar energy generation.

Roles and Responsibilities:
- Research the features offered by other similar software and collaborate with developers on implementing them.
- Develop new features and ideas to make the product better and more user-centric.
- Continuously look out for new and creative solutions to implement new features or improve old ones.
- Create algorithms from scratch and implement them in the software.

Who can apply?
- 1-2 years of experience in any 3D software development (Three.js, Unity, Unreal, OpenGL)
- Strong Data structure and Algorithm knowledge
- Strong Aptitude and Reasoning
- Good understanding of mathematical formulas.
- Although we are not language-centric, knowledge of JavaScript is preferred (especially 3D JavaScript libraries)
- Strong drive to learn new technologies as we are constantly evolving
- Cloud experience with AWS/Google Cloud is a big plus.
Recko
Agency job
via Zyoin Web Private Limited by Chandrakala M
Bengaluru (Bangalore)
3 - 7 yrs
₹16L - ₹40L / yr
Big Data
Hadoop
Spark
Apache Hive
Data engineering
+6 more

Recko Inc. is looking for data engineers to join our kick-ass engineering team. We are looking for smart, dynamic individuals to connect all the pieces of the data ecosystem.

 

What are we looking for:

  1. 3+ years of development experience in at least one of MySQL, Oracle, PostgreSQL or MSSQL, and experience working with Big Data frameworks/platforms/data stores like Hadoop, HDFS, Spark, Oozie, Hue, EMR, Scala, Hive, Glue, Kerberos, etc.

  2. Strong experience setting up data warehouses, data modeling, data wrangling and dataflow architecture on the cloud

  3. 2+ years of experience with public cloud services such as AWS, Azure, or GCP and languages like Java/Python, etc.

  4. 2+ years of development experience in Amazon Redshift, Google BigQuery or Azure data warehouse platforms preferred

  5. Knowledge of statistical analysis tools like R, SAS etc 

  6. Familiarity with any data visualization software

  7. A growth mindset and a passion for building things from the ground up; most importantly, you should be fun to work with

As a data engineer at Recko, you will:

  1. Create and maintain optimal data pipeline architecture.

  2. Assemble large, complex data sets that meet functional / non-functional business requirements.

  3. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

  4. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies (a brief illustrative sketch follows this list).

  5. Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.

  6. Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.

  7. Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.

  8. Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.

  9. Work with data and analytics experts to strive for greater functionality in our data systems.
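
As a rough, hypothetical illustration of the extract-transform-load work described in item 4 above, and not taken from the listing itself, the short PySpark sketch below reads raw JSON events, applies a SQL transformation, and writes partitioned Parquet. The bucket paths, view name and column names are placeholders.

    # Illustrative PySpark ETL sketch (hypothetical paths and schema):
    # extract raw JSON events, transform with Spark SQL, load as partitioned Parquet.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

    # Extract: raw event dumps landed by an upstream producer (hypothetical path).
    events = spark.read.json("s3a://example-bucket/raw/events/")
    events.createOrReplaceTempView("events")

    # Transform: aggregate daily order totals per customer with Spark SQL
    # (event_ts is assumed to be a timestamp column).
    daily_totals = spark.sql("""
        SELECT customer_id,
               to_date(event_ts) AS event_date,
               SUM(order_amount) AS total_amount,
               COUNT(*)          AS order_count
        FROM events
        WHERE event_type = 'order_completed'
        GROUP BY customer_id, to_date(event_ts)
    """)

    # Load: write partitioned Parquet for downstream analytics tools.
    (daily_totals.write
        .mode("overwrite")
        .partitionBy("event_date")
        .parquet("s3a://example-bucket/curated/daily_order_totals/"))

    spark.stop()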

 

About Recko: 

Recko was founded in 2017 to organise the world’s transactional information and provide intelligent applications to finance and product teams to make sense of the vast amount of data available. With the proliferation of digital transactions over the past two decades, Enterprises, Banks and Financial institutions are finding it difficult to keep track of the money flowing across their systems. With the Recko Platform, businesses can build, integrate and adapt innovative and complex financial use cases within the organization and across external payment ecosystems with agility, confidence and at scale. Today, customer-obsessed brands such as Deliveroo, Meesho, Grofers, Dunzo, Acommerce, etc. use Recko so their finance teams can optimize resources with automation and prioritize growth over repetitive and time-consuming tasks around day-to-day operations.

 

Recko is a Series A funded startup, backed by marquee investors like Vertex Ventures, Prime Venture Partners and Locus Ventures. Traditionally enterprise software is always built around functionality. We believe software is an extension of one’s capability, and it should be delightful and fun to use.

 

Working at Recko: 

We believe that great companies are built by amazing people. At Recko, We are a group of young Engineers, Product Managers, Analysts and Business folks who are on a mission to bring consumer tech DNA to enterprise fintech applications. The current team at Recko is 60+ members strong with stellar experience across fintech, e-commerce, digital domains at companies like Flipkart, PhonePe, Ola Money, Belong, Razorpay, Grofers, Jio, Oracle etc. We are growing aggressively across verticals.

Agency job
via Jobdost by Heena K
Bengaluru (Bangalore)
3 - 6 yrs
₹8L - ₹25L / yr
DBA
SQL server
SQL
Database Design
Amazon Web Services (AWS)
+3 more

 

Job Title: Senior Cloud Service DBA

Department & Team: Technology

Location: India

Reporting To: Senior DBA – Team Lead

 

Role Purpose:

 

This role will be a mix of project, BAU and innovation and will encompass typical database administration duties that you may expect in a large-scale telecommunications environment: performance tuning, replication, database design, backups, high availability, encryption, security, configuration etc.


The Client DBA Team is responsible for the delivery, maintenance and support of all database platforms used by Clients twenty-four hours a day, seven days a week. The team’s primary function is to ensure all relational and non-relational databases are optimised for performance and designed with agreed levels of fault tolerance.

 

1. Network Services Team – Responsible for IP Network and its associated components

2. Infrastructure Team – Responsible for Server and Storage systems

3. Database Services Team – Responsible for all Databases

4. Cloud Architect Team – Delivering future strategy, ongoing cloud performance optimisation.

 

The DBA function forms part of Client’s Tier 3 support function and works closely with the internal NOC, Service Desk, Infrastructure, IP Networks and Cloud Architect teams to enable the business to achieve its stated objectives by assisting the other technology teams in achieving world-class benchmarks of customer service and support.

 

To highlight, a key requirement of the role will be involvement in defining our future strategy around database modernisation in the cloud.

 

Responsibilities:

 

Operations

· Be involved in new solution design discussions and recommend suitable, secure, performance-optimised database offerings based on business requirements

· Ensure all databases in AWS and Azure are configured for performance, scale and high availability where required

· Take responsibility for the modernisation of the Client’s database estate in the cloud, leveraging open-source technologies and cloud-native hosting platforms

· Drive innovation by constantly reviewing latest public cloud platform database service releases such as Babelfish in AWS to fast track adoption of native services

· Ensure security considerations are at the forefront when designing and managing database solutions

· Optimise query performance

· Ensure all key databases have deep-insight monitoring enabled to improve fault-detection capabilities

· Perform regular database maintenance when required and ensure databases are backed up according to agreed RTO / RPO information

· Maintenance work to be planned meticulously to minimise/eradicate self-inflicted P1 outages

· Monitor database costs regularly and identify strategies to minimise cost as part of internal FinOps practices

· Ability to provide technical system solutions, determine overall design direction and provide hardware recommendations for complex technical issues

· Provision, deploy and monitor cloud environments using automation tools like Terraform (a hypothetical provisioning sketch follows this list)
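
The bullet above names Terraform; purely as an alternative, hypothetical illustration of programmatic provisioning, and not part of the original listing, the sketch below uses the AWS SDK for Python (boto3) to create a small, highly available, encrypted RDS instance. All identifiers and values are placeholders.

    # Illustrative sketch only: provisioning a small RDS instance with boto3,
    # shown as one alternative to a Terraform resource. Hypothetical names/values.
    import boto3

    rds = boto3.client("rds", region_name="eu-west-1")

    rds.create_db_instance(
        DBInstanceIdentifier="demo-postgres",    # hypothetical identifier
        Engine="postgres",
        DBInstanceClass="db.t3.micro",
        AllocatedStorage=20,                     # GiB
        MasterUsername="dbadmin",
        MasterUserPassword="change-me-please",   # use a secrets manager in practice
        MultiAZ=True,                            # high availability
        StorageEncrypted=True,                   # encryption at rest
        BackupRetentionPeriod=7,                 # daily backups retained for 7 days
    )

    # Block until the instance is available before handing it to applications.
    rds.get_waiter("db_instance_available").wait(DBInstanceIdentifier="demo-postgres")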

Skills & Experience:

Certifications:

· SQL Server Database Administration - REQUIRED

· AWS Certified Solutions Architect Associate – Highly Desirable

· AWS Certified Database Specialty – REQUIRED

· Azure Database Administrator Associate – Highly Desirable

 

Skills & Experience:

· The ideal candidate has been supporting traditional server-based relational databases for over 8 years and has transitioned into AWS and Azure public cloud over the last 5 years

· SQL Server / MSSQL 2008 / 2012 / 2014 / 2016 / 2017 / 2019 (including Always-On and Analysis Services)

· Postgres / MYSQL as standalone and managed service platforms

· Strong database migration experience (Particularly MSSQL to open source and leveraging AWS native platforms including RDS, Athena, Aurora)

· Extensive AWS experience in a commercial environment, architecting database best practices

· Strong experience supporting AWS/Azure based data lake/data warehouse environments; required to support internal BI teams

· Solid experience and understanding of which workloads are best suited to which specific database platforms in AWS and Azure

· Extensive experience and understanding of database security, including appropriate encryption and authentication best practices

· Good working knowledge of Microsoft Power BI

· Good knowledge of Azure cloud database services

· Any working experience with non-relational databases (internally hosted or managed services such as DynamoDB in AWS) will be favoured

· Good working knowledge of Windows and Linux Server Operating Systems

· Excellent presentation skills to both an internal and external audience

· The ability to share and communicate your specific expertise to the rest of the Technology group

 

Behavioural Fit:

 

· Professional appearance and manner

· High personal drive; results-oriented; make things happen; “can-do attitude”

· Can work and adapt within a highly dynamic and growing environment

· Team Player; effective at building close working relationships with others

· Effectively manages diversity within the workplace

· Strong focus on service delivery and the needs and satisfaction of internal clients

· Able to see issues from a global, regional and corporate perspective

· Able to effectively plan and manage large projects

· Excellent communication skills and interpersonal skills at all levels

· Strong analytical, presentation and training skills

· Innovative and creative

· Visionary and strategic view of technology enablers (creative and innovative)

· High verbal and written communication ability, able to influence effectively at all levels

· Possesses technical expertise and knowledge to lead by example and input into technical debates

· Depth and breadth of experience in infrastructure technologies

· Enterprise mentality and a global mindset

· Sense of humour

 

Role Key Performance Indicators:

 

· Design and deliver repeatable, best in class, cloud database solutions

· Pro-actively monitor service quality and take action to scale operational services, in line with business growth

· Generate operating efficiencies, to be agreed with Infrastructure Services Manager

· Establish a “best in sector” level of operational service delivery and insight

· Help create an effective team

· Continuous improvement

 

 

 

Saviance Technologies
Posted by Shipra Agrawal
NCR (Delhi | Gurgaon | Noida)
3 - 5 yrs
₹7L - ₹9L / yr
Power BI
Business Intelligence (BI)
DAX
Data modeling
+3 more

 

Job Title: Power BI Developer (Onsite)

Location: Park Centra, Sec 30, Gurgaon

CTC: 8 LPA

Time: 1:00 PM - 10:00 PM

  

Must Have Skills: 

  • Power BI Desktop Software
  • Dax Queries
  • Data modeling
  • Row-level security
  • Visualizations
  • Data Transformations and filtering
  • SSAS and SQL

 

Job description:

 

We are looking for a PBI Analytics Lead responsible for efficient data visualization, DAX queries and data modeling. The candidate will work on creating complex Power BI reports, will be involved in writing complex M and DAX queries, and will work on data modeling, row-level security, visualizations, data transformations, and filtering. They will work closely with the client team to provide solutions and suggestions on Power BI.

 

Roles and Responsibilities:

 

  • Accurate, intuitive, and aesthetic Visual Display of Quantitative Information: We generate data, information, and insights through our business, product, brand, research, and talent teams. You would assist in transforming this data into visualizations that represent easy-to-consume visual summaries, dashboards and storyboards. Every graph tells a story.
  • Understanding Data: You would be performing and documenting data analysis, data validation, and data mapping/design. You would be mining large datasets to determine their characteristics and select appropriate visualizations.
  • Project Owner: You would develop, maintain, and manage advanced reporting, analytics, dashboards and other BI solutions, and would be continuously reviewing and improving existing systems and collaborating with teams to integrate new systems. You would also contribute to the overall data analytics strategy by knowledge sharing and mentoring end users.
  • Perform ongoing maintenance & production of Management dashboards, data flows, and automated reporting.
  • Manage upstream and downstream impact of all changes on automated reporting/dashboards
  • Independently apply problem-solving ability to identify meaningful insights to business
  • Identify automation opportunities and work with a wide range of stakeholders to implement the same.
  • The ability and self-confidence to work independently and increase the scope of the service line

 

Requirements: 

  • 3+ years of work experience as an Analytics Lead / Senior Analyst / Sr. PBI Developer.
  • Sound understanding and knowledge of PBI Visualization and Data Modeling with DAX queries
  • Experience in leading and mentoring a small team.

 

 

 
