Power Business Intelligence Developer

at N H Enterprises

Posted by Yogita Purandare
Pune, Kharadi  •  2 - 5 yrs  •  ₹4L - ₹7L / yr  •  Full time
Skills
Experience in dashboard development
Experience in data visualization
Working knowledge of Microsoft Power Pivot tables
Working experience in Power BI Desktop
Big Data
Should be able to create awesome dashboards. Should have hands-on knowledge of all of the following: visualizations, datasets, reports, dashboards, and tiles. Excellent querying skills using T-SQL. Should have prior exposure to SSRS and/or SSAS. Working knowledge of Microsoft Power Pivot, Power View, and Power BI Desktop.
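Since the role calls for Power Pivot and T-SQL skills, here is a rough illustration, in plain Python with made-up sample rows, of the kind of grouped aggregation a pivot table or a T-SQL GROUP BY query produces for a dashboard tile:

```python
from collections import defaultdict

# Hypothetical sales rows of the kind a Power Pivot table or a
# T-SQL "GROUP BY region" query would summarise on a dashboard tile.
rows = [
    {"region": "West", "product": "A", "revenue": 120.0},
    {"region": "West", "product": "B", "revenue": 80.0},
    {"region": "East", "product": "A", "revenue": 200.0},
]

def pivot_total(rows, group_key, value_key):
    """Sum value_key per distinct group_key (a SUM-style pivot)."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[group_key]] += row[value_key]
    return dict(totals)

print(pivot_total(rows, "region", "revenue"))
```

The equivalent T-SQL would be a `SELECT region, SUM(revenue) ... GROUP BY region`; the point is only the shape of the aggregation, not a real dataset.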

About N H Enterprises

Founded in 2014, N H Enterprises is a bootstrapped company based in India. It has 1-5 employees currently and works in the domain of IT Consultancy.
Founded: 2014
Type: Services
Size: 0-20 employees
Stage: Bootstrapped

Similar jobs

Analyst (Research)

at Kwalee

Founded 2011  •  Product  •  100-500 employees  •  Profitable
Market Research
Big Data
Bengaluru (Bangalore)  •  2 - 10 yrs  •  Best in industry

Kwalee is one of the world’s leading multiplatform game publishers and developers, with well over 750 million downloads worldwide for mobile hits such as Draw It, Teacher Simulator, Let’s Be Cops 3D, Traffic Cop 3D and Makeover Studio 3D. Alongside this, we also have a growing PC and Console team of incredible pedigree that is on the hunt for great new titles to join TENS!, Eternal Hope and Die by the Blade. 

With a team of talented people collaborating daily between our studios in Leamington Spa, Bangalore and Beijing, or on a remote basis from Turkey, Brazil, the Philippines and many more places, we have a truly global team making games for a global audience. And it’s paying off: Kwalee games have been downloaded in every country on earth! If you think you’re a good fit for one of our remote vacancies, we want to hear from you wherever you are based.

Founded in 2011 by David Darling CBE, a key architect of the UK games industry who previously co-founded and led Codemasters for many years, our team also includes legends such as Andrew Graham (creator of Micro Machines series) and Jason Falcus (programmer of classics including NBA Jam) alongside a growing and diverse team of global gaming experts. Everyone contributes creatively to Kwalee’s success, with all employees eligible to pitch their own game ideas on Creative Wednesdays, and we’re proud to have built our success on this inclusive principle. Could your idea be the next global hit?

What’s the job?

As an Analyst (Research) in the Mobile Publishing division you’ll be using your previous experience in analysing market trends to pull usable insights from numerous sources, and find trends others might miss.

What you tell your friends you do 

“I provide insights that help guide the direction of Kwalee’s mobile publishing team as they expand their operation”

What you will really be doing 

  • Using our internal and external data sources to generate insights.
  • Assess market trends and make recommendations to our publishing team on which opportunities to pursue and which to decline
  • Evaluate market movements and use data to assess new opportunities
  • Create frameworks to predict how successful new content can be and the metrics games are likely to achieve
  • Evaluate business opportunities and conduct due diligence on potential business partners we are planning to work with
  • Be an expert on industry data sets and how we can best use them
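A framework for predicting content metrics, as described above, might start from simple trend signals. Here is a minimal sketch of one such signal, a trailing moving average over daily downloads; all figures are hypothetical:

```python
def moving_average(series, window):
    """Trailing moving average; one value per full window."""
    return [
        sum(series[i - window + 1 : i + 1]) / window
        for i in range(window - 1, len(series))
    ]

# Hypothetical daily downloads for a prototype game.
downloads = [100, 120, 90, 150, 170, 160, 210]
print(moving_average(downloads, 3))
```

A rising moving average against a flat category baseline is the kind of signal an analyst might surface when recommending which opportunities to pursue.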

How you will be doing this

  • You’ll be part of an agile, multidisciplinary and creative team and work closely with them to ensure the best results.
  • You'll think creatively, be motivated by challenges, and constantly strive for the best.
  • You'll work with cutting-edge technology; if you need software or hardware to get the job done efficiently, you will get it. We even have a robot!

Team

Our talented team is our signature. We have a highly creative atmosphere with more than 200 staff where you’ll have the opportunity to contribute daily to important decisions. You’ll work within an extremely experienced, passionate and diverse team, including David Darling and the creator of the Micro Machines video games.

Skills and Requirements

  • Previous experience of working with big data sets, preferably in a gaming or tech environment
  • An advanced degree in a related field
  • A keen interest in video games and the market, particularly in the mobile space
  • Familiarity with industry tools and data providers
  • A can-do attitude and ability to move projects forward even when outcomes may not be clear 

We offer

  • We want everyone involved in our games to share our success; that's why we have a generous team profit-sharing scheme from day 1 of employment
  • In addition to a competitive salary we also offer private medical cover and life assurance
  • Creative Wednesdays! (Design and make your own games every Wednesday)
  • 20 days of paid holidays plus bank holidays 
  • Hybrid model available depending on the department and the role
  • Relocation support available 
  • Great work-life balance with flexible working hours
  • Quarterly team building days - work hard, play hard!
  • Monthly employee awards
  • Free snacks, fruit and drinks

Our philosophy

We firmly believe in creativity and innovation and that a fundamental requirement for a successful and happy company is having the right mix of individuals. With the right people in the right environment anything and everything is possible.

Kwalee makes games to bring people, their stories, and their interests together. As an employer, we’re dedicated to making sure that everyone can thrive within our team by welcoming and supporting people of all ages, races, colours, beliefs, sexual orientations, genders and circumstances. With the inclusion of diverse voices in our teams, we bring plenty to the table that’s fresh, fun and exciting; it makes for a better environment and helps us to create better games for everyone! This is how we move forward as a company – because these voices are the difference that make all the difference.

Job posted by
Michael Hoppitt

Senior Software Engineer

at Hiring for a leading client

Agency job
via Jobaaj.com
Big Data
Apache Kafka
Business Intelligence (BI)
Data Warehouse (DWH)
Coding
Hadoop
Apache Impala
Spark
Python
CI/CD
Git
Tableau
Qlikview
Amazon Web Services (AWS)
Java
Apache Oozie
Data engineering
Databases
Software Development
Airflow
New Delhi  •  3 - 5 yrs  •  ₹10L - ₹15L / yr
Job Description:
Senior Software Engineer - Data Team

We are seeking a highly motivated Senior Software Engineer with hands-on experience building scalable, extensible data solutions, identifying and addressing performance bottlenecks, collaborating with other team members, and implementing best practices for data engineering. Our engineering process is fully agile, with a really fast release cycle, which keeps our environment energetic and fun.

What you'll do:

Design and develop scalable applications.
Work with Product Management teams to get maximum value out of existing data.
Contribute to continual improvement by suggesting improvements to the software system.
Ensure high scalability and performance.
Advocate for good, clean, well-documented and performant code; follow standards and best practices.
We'd love for you to have:

Education: Bachelor's/Master's degree in Computer Science.
Experience: 3-5 years of relevant experience in BI/DW with hands-on coding experience.

Mandatory Skills

Strong problem-solving skills
Strong experience with Big Data technologies: Hive, Hadoop, Impala, HBase, Kafka, Spark
Strong experience with orchestration frameworks such as Apache Oozie and Airflow
Strong experience in data engineering
Strong experience with database and data warehousing technologies, and the ability to understand complex designs and system architecture
Experience with the full software development lifecycle: design, develop, review, debug, document, and deliver (especially in a multi-location organization)
Good knowledge of Java
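Orchestration frameworks like Oozie and Airflow, mentioned above, essentially run pipeline tasks in dependency order. This toy sketch (task names are hypothetical, and this is not Airflow's API) shows the topological ordering such a scheduler derives from a declared DAG, using Python's standard library:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: ingest -> clean -> {aggregate, load} -> report.
# Each key maps a task to the set of tasks it depends on; an
# orchestrator such as Airflow or Oozie runs tasks in an order
# consistent with these dependencies.
deps = {
    "clean": {"ingest"},
    "aggregate": {"clean"},
    "load": {"clean"},
    "report": {"aggregate", "load"},
}

order = list(TopologicalSorter(deps).static_order())
print(order)
```

Real orchestrators add scheduling, retries, and backfills on top, but the dependency graph is the core abstraction.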
Desired Skills

Experience with Python
Experience with reporting tools like Tableau and QlikView
Experience with Git and CI/CD pipelines
Awareness of cloud platforms, e.g. AWS
Excellent communication skills with team members and business owners, across teams
Ability to work in a challenging, dynamic environment and meet tight deadlines
Job posted by
Saksham Agarwal

Principal Data Engineer

at SkyPoint Cloud

Founded 2019  •  Product  •  20-100 employees  •  Profitable
Python
PySpark
Spark
Microsoft Windows Azure
Data engineering
Big Data
Microsoft SQL Server
RESTful APIs
NOSQL Databases
Apache Commons
databricks
Bengaluru (Bangalore)  •  7 - 14 yrs  •  Best in industry
Whom we want:
 
SkyPoint is looking for ambitious, independent engineers who want to have a big impact at a fast-growing company. You will work on our core data pipeline and the integrations that bring in data from the many sources we support. We are looking for people who can understand the key values that make our product great and implement those values in the many small decisions you make every day as a developer.
 
What you do:
 
As a Principal Data Engineer at SkyPoint:
 
You will be working with Python, PySpark, Spark (Azure Databricks), VS Code, REST APIs, Azure Durable Functions, Cosmos DB, Serverless, and Kubernetes container-based microservices and interacting with various Delta Lakehouse and NoSQL databases.
You will process the data into clean, unified, incremental, automated updates via Azure Durable Functions, Azure Data Factory, Delta Lake, and Spark.
You will take ownership of the product and lead the team, drawing on your managerial experience.
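The clean, unified, incremental updates described above reduce to key-based upserts. As a conceptual sketch only (the real work would run on Spark/Delta Lake, and the rows here are invented), a merge of new records into an existing table might look like:

```python
def upsert(target, updates, key):
    """Merge update rows into target by key: update matches, insert the rest.
    Conceptually similar to a Delta Lake MERGE INTO, in plain Python."""
    merged = {row[key]: row for row in target}
    for row in updates:
        merged[row[key]] = {**merged.get(row[key], {}), **row}
    return list(merged.values())

# Hypothetical customer records: one existing row updated, one new row added.
target = [{"id": 1, "email": "a@x.com"}, {"id": 2, "email": "b@x.com"}]
updates = [{"id": 2, "email": "b2@x.com"}, {"id": 3, "email": "c@x.com"}]
result = upsert(target, updates, "id")
```

At lakehouse scale the same match-then-update-or-insert logic is expressed declaratively and executed distributed, but the semantics are these.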
 
Primary Duties & Responsibilities:
 
  • Making high-level estimations with hypotheses and critical points.
  • Delivering components aligned with the scope, budget, and planning committed to with the business owner.
  • Designing the roadmap and following progress, identifying critical points and risks.
  • Experience working with languages like Python and Go, and technologies such as serverless and containers.
  • Strong technical and problem-solving skills; recent hands-on experience in Azure Machine Learning is good to have.
  • Experience in reliable distributed systems, with an emphasis on high-volume data management within enterprise and/or web-scale products and platforms that operate under strict SLAs.
  • Broad technical knowledge encompassing software development and automation.
  • Experience with a wide array of algorithms and data structures.
  • Expertise in working with Azure Functions, Azure Data Lake, Azure Data Factory, Azure Databricks/Spark, Azure DevOps, PySpark, Scikit-learn, TensorFlow, Keras, PyTorch.
  • Best practices in design and programming.
  • Entrepreneurial mindset, excellent communication, and technical leadership skills.
  • Creating and contributing to an environment geared to innovation, high productivity, high quality, and customer service.
 
 
Skills & Experience Required:
 
  • The ideal candidate will be an enthusiastic leader with experience building in a professional environment, 6+ years of overall hands-on technical experience, and at least 2+ years in a leadership role.
  • Bachelor's/Master's degree, preferably in Software Engineering or Computer Science, from a reputed institution.
  • Planning and executing strategies for completing projects on time.
  • Researching and developing designs and products.
  • Ensuring products have the support of upper management.
  • Providing clear and concise instructions to the team.
  • Implementing and providing tools for non-regression test automation.
  • Generating and reviewing documentation for all database changes or refinements.
  • Managing the entire module; team-leading experience.
  • Making recommendations for software, hardware, and data storage upgrades.
  • Communicating with your team members and staff to effectively understand and interpret data changes or requirements.
  • Most recent work experience MUST include working on Python, Spark (Azure Databricks), Azure Durable Functions, Cosmos DB, Azure Data Factory, Delta Lakehouse, PySpark, NoSQL DB, Serverless, and Kubernetes container-based microservices.
  • Excellent verbal and written communication skills.
Job posted by
Sheetal B

Technical Content Engineer

at Loonycorn

Founded 2015  •  Products & Services  •  20-100 employees  •  Profitable
Data Structures
Big Data
Cloud Computing
Kubernetes
Java
Databases
Bengaluru (Bangalore)  •  0 - 7 yrs  •  ₹3L - ₹5L / yr
About Loonycorn:

Founded by Google, Stanford, and Columbia alumni, Loonycorn is a leading studio for e-learning content on machine learning, cloud computing, blockchain, and other emerging technologies.

About the Role:

We are looking for folks to build software projects which will be part of technical content similar to what you'd find at the links below:

https://www.pluralsight.com/search?q=janani+ravi
https://www.pluralsight.com/search?q=vitthal+srinivasan
https://www.udemy.com/u/janani-ravi-2/

This involves:
- learning a new technology from scratch
- coding real-world projects which use that technology (you will be a software engineer writing code, not making slides or talking about code)

What is important to us:
- Grit - Perseverance in working on hard problems. Technical video-making is difficult and detail-oriented (that's why it is a highly profitable business)
- Craftsmanship - Our video-making is quite artisanal - lots of hard work and small details. There are many excellent roles where doing smart 80-20 trade-offs is the way to succeed - this is not one of them.
- Clarity - Talking and thinking in direct, clear ways is super-important in what we do. Folks who use a lot of jargon or cliches, or work on writing code without understanding what goes on underneath will not be a fit
- Creativity - Analogies, technical metaphors, and other artistic elements are an important part of what we do.

What is not all that important to us:
- Your school or labels: Perfectly fine whatever college or company you are applying from
- English vocabulary or pronunciation: You don't need to 'talk well' or be flashy to build good projects
Job posted by
Vitthal Srinivasan

Data Engineer

at Easebuzz

Founded 2016  •  Product  •  100-500 employees  •  Raised funding
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
Data Analytics
Apache Kafka
SQL
Amazon Web Services (AWS)
Big Data
DynamoDB
MongoDB
EMR
Amazon Redshift
ETL
Data architecture
Data modeling
Pune  •  2 - 4 yrs  •  ₹2L - ₹20L / yr

Company Profile:

 

Easebuzz is a payment solutions company (fintech organisation) which enables online merchants to accept, process and disburse payments through developer-friendly APIs. We are focused on building plug-and-play products, including the payment infrastructure, to solve complete business problems. Definitely a wonderful place where all the action related to payments, lending, subscriptions, and eKYC is happening at the same time.

 

We have been consistently profitable and are constantly developing new innovative products; as a result, we have grown 4x over the past year alone. We are well capitalised and recently closed a fundraise of $4M in March 2021 from prominent VC firms and angel investors. The company is based out of Pune and has a total strength of 180 employees. Easebuzz's corporate culture is tied into the vision of building a workplace which breeds open communication and minimal bureaucracy. An equal opportunity employer, we welcome and encourage diversity in the workplace. One thing you can be sure of is that you will be surrounded by colleagues who are committed to helping each other grow.

 

Easebuzz Pvt. Ltd. has its presence in Pune, Bangalore, Gurugram.

 


Salary: As per company standards.

 

Designation: Data Engineer

 

Location: Pune

 

Experience with ETL, Data Modeling, and Data Architecture

Design, build and operationalize large-scale enterprise data solutions and applications using one or more of the AWS data and analytics services in combination with 3rd-party tools: Spark, EMR, DynamoDB, Redshift, Kinesis, Lambda, Glue.

Experience with AWS cloud data lake for development of real-time or near real-time use cases

Experience with messaging systems such as Kafka/Kinesis for real time data ingestion and processing

Build data pipeline frameworks to automate high-volume and real-time data delivery

Create prototypes and proof-of-concepts for iterative development.

Experience with NoSQL databases, such as DynamoDB, MongoDB etc

Create and maintain optimal data pipeline architecture.

Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.


Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.

Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.

Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.

Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.

Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.

Evangelize a very high standard of quality, reliability and performance for data models and algorithms that can be streamlined into the engineering and sciences workflow

Build and enhance data pipeline architecture by designing and implementing data ingestion solutions.
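The extract-transform-load responsibilities above can be sketched end to end with standard-library tools; in this illustration the event data is invented and SQLite stands in for a real warehouse such as Redshift:

```python
import sqlite3

# Extract: hypothetical raw payment events (in practice, from Kafka/Kinesis).
raw_events = [("txn1", "120.50"), ("txn2", "80.00"), ("txn1", "120.50")]

# Transform: deduplicate by transaction id and cast amounts to numbers.
cleaned = {txn: float(amount) for txn, amount in raw_events}

# Load: an in-memory SQLite table stands in for the warehouse target.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (txn_id TEXT PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO payments VALUES (?, ?)", cleaned.items())

# Analytics on top of the loaded data, as a downstream tool would run.
total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
```

A production pipeline adds partitioning, schema evolution, and monitoring around these same three stages.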

 

Employment Type

Full-time

 

Job posted by
Amala Baby

SOFTWARE ENGINEER – Java, Scala, Big Data

at CNH Industrial

Founded 1900  •  Product  •  5000+ employees  •  Profitable
Big Data
Scala
Spring Boot
Java
Gurugram  •  3 - 8 yrs  •  ₹15L - ₹30L / yr

Job Description:

 

As a Software Engineer – Java, Scala, Big Data you will join a highly skilled software team in delivering innovative mobile and web applications that make up CNH Industrial’s next generation digital platform. The digital platform will enable products that integrate with connected CNH Industrial tractors, sprayers and combines and enable wide range of farm management capabilities.

 

This is an excellent opportunity to join the technology revolution currently taking place across the agricultural industry and work with highly skilled and talented people in a global, diversified company.

 

Primary responsibilities include working closely with product management, UX designers and backend developers on the design, development, testing and deployment of our next generation applications and existing product lines. You will undertake all assigned tasks and responsibilities effectively and professionally in accordance with company, team and customer expectations.

 

Essential Duties and Responsibilities

 

  1. Work in a team or individually to design, develop and test software for cloud, web and mobile
  2. Design, develop, test and document quality software to user and functional requirements within specified timeframes and in accordance with CNHI coding standards
  3. Generate rapid prototypes for feasibility testing
  4. Generate all documentation relevant to software operation
  5. Adhere to prescribed development systems, processes and procedures, and ensure efficient, effective, high-quality delivery
  6. Communicate effectively with all stakeholders
  7. Perform tasks as specified by the Delivery Lead/Team Lead
  8. Other related duties as required

 

Competencies

 

Qualifications

To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill and/or ability required.

 

Education Qualifications and/or Experience

§ Bachelor's degree in Computer Science or Computer Engineering from an accredited university

§ 3+ years of industry experience

§ 3+ years of Java Spring Boot server application design and testing experience

§ Experience with writing server applications, working with large amounts of data, and Big Data processing – batch and streaming

§ Strong fundamentals on OOP, RESTful architectures, Design Patterns, Data Structures, Algorithms

§ Experience with RESTful API development

§ Experience with Microservices development; working on Docker, Kubernetes, Helm/Terraform

§ Experience with Microsoft Azure and cloud services including exposure to PAAS services like service bus, event hub, blob stores, key vaults, API managers, Function Apps (serverless)

§ Experience with concurrency topics, asynchronous programming

 

Computer Skills

 

§ Java (Spring, Spring Boot, Hibernate); Scala

§ Microservices, Web Services

§ OAuth 2.0 (JWT), Swagger, Postman, Open API Specification

§ Relational (SQL Server / Postgres); NoSQL (CosmosDB / MongoDB)

§ Big Data/Geospatial: HBase 2.1.6 (HDI 4.0), Geomesa 3.0.0

§ Batch and stream data processing: Apache Kafka, Apache Flink, Apache Spark, Azure Service Bus

§ Caching: Redis

§ Good working knowledge of CI/CD environments (preferably Azure DevOps), Git or similar configuration management software; Build Automation (Maven)

§ Knowledge of Testing Tools such as Jasmine, Cypress, NUnit, xUnit, Junit, Testcontainers

Job posted by
Prabhat Mishra

Principal Engineer - Hadoop

at US Based Product Organization

Agency job
via e-Hireo
Hadoop
HDFS
Apache Hive
Zookeeper
Cloudera
SQL
HQL
Hortonworks
Big Data
LDAP
Linux/Unix
Amazon Web Services (AWS)
SSL
Bengaluru (Bangalore)  •  10 - 15 yrs  •  ₹25L - ₹45L / yr

Responsibilities:

  • Provide support services to our Gold & Enterprise customers using our flagship product suites. This may include assistance provided during the engineering and operations of distributed systems, as well as responses for mission-critical systems and production customers.
  • Lead end-to-end delivery and customer success of next-generation features related to scalability, reliability, robustness, usability, security, and performance of the product
  • Lead and mentor others about concurrency, parallelization to deliver scalability, performance, and resource optimization in a multithreaded and distributed environment
  • Demonstrate the ability to actively listen to customers and show empathy to the customer’s business impact when they experience issues with our products


Required Skills:

  • 10+ years of experience with a highly scalable, distributed, multi-node environment (100+ nodes)
  • Hadoop operation including Zookeeper, HDFS, YARN, Hive, and related components like the Hive metastore, Cloudera Manager/Ambari, etc
  • Authentication and security configuration and tuning (KNOX, LDAP, Kerberos, SSL/TLS, second priority: SSO/OAuth/OIDC, Ranger/Sentry)
  • Java troubleshooting, e.g., collection and evaluation of jstacks, heap dumps
  • Linux, NFS, Windows, including application installation, scripting, basic command line
  • Docker and Kubernetes configuration and troubleshooting, including Helm charts, storage options, logging, and basic kubectl CLI
  • Experience working with scripting languages (Bash, PowerShell, Python)
  • Working knowledge of application, server, and network security management concepts
  • Familiarity with virtual machine technologies
  • Knowledge of databases like MySQL and PostgreSQL
  • Certification on any of the leading Cloud providers (AWS, Azure, GCP ) and/or Kubernetes is a big plus
Job posted by
Biswajit Banik
Windows Azure
Hadoop
Spark
Data Structures
ADF
Big Data
Bengaluru (Bangalore)  •  3 - 9 yrs  •  ₹8L - ₹16L / yr
  • Working knowledge of setting up and running HDInsight applications
  • Hands-on experience in Spark, Scala & Hive
  • Hands-on experience in ADF – Azure Data Factory
  • Hands-on experience in Big Data & Hadoop ecosystems
  • Exposure to Azure service categories like PaaS components and IaaS subscriptions
  • Ability to design and develop ingestion & processing frameworks for ETL applications
  • Hands-on experience in PowerShell scripting and deployment on Azure
  • Experience in performance tuning and memory configuration
  • Should be adaptable to learn & work on new technologies
  • Should have good written and spoken communication
Job posted by
Harpreet kour

Assistant Manager - Business Analytics

at Kalibre global konnects

Founded 2014  •  Services  •  20-100 employees  •  Raised funding
Python
R Programming
Data Science
Data Analytics
Big Data
Ahmedabad  •  1 - 4 yrs  •  ₹3L - ₹5L / yr
Job Description:
  • Job Title: Assistant Manager - Business Analytics
  • Age: Max. 35 years
  • Working Days: 6 days a week
  • Location: Ahmedabad, Gujarat
  • Monthly CTC: Salary will be commensurate with experience.
  • Educational Qualification: The candidate should have a bachelor's degree in IT/Engineering from any recognized university.
  • Experience: 2+ years of work experience in AI/ML/business analytics with an institute of repute or corporate.

Required Technical Skills:
  • A fair understanding of business analytics, data science, visualization/Big Data, etc.
  • Basic knowledge of different analytical tools such as R programming, Python, etc.
  • Hands-on experience in Moodle development (desirable).
  • Good knowledge of customizing Moodle functionalities and developing custom themes for Moodle (desirable).
  • An analytical mindset, enjoying helping participants solve problems and turning data into useful, actionable information.

Key Responsibilities include:
  • Understand the tools and technologies specific to e-learning and blended learning development and delivery.
  • Provide academic as well as technical assistance to the faculty members teaching the analytics courses.
  • Work closely with the instructors, assisting them in programming, coding, testing, etc.
  • Prepare the lab study material in coordination with the instructors and assist students in the programming lab, solving their doubts.
  • Work on assignments dealing with the routine daily operation, use, and configuration of the Learning Management System (LMS).
  • Administer learning technology platforms, including the creation of courses, certifications and other e-learning programs on the platforms.
  • Provide support within the eLearning department, provide technical support to external clients, and administrate the Learning Management System.
  • Create user groups, assign content and assessments to the right target audience, run reports and create learning events in the LMS.
  • Perform regular maintenance of the LMS database, including adding or removing courses.
  • Upload, test, deploy and maintain all training materials/learning assets hosted in the LMS.
  • Ability to multi-task.
  • Ability to demonstrate accuracy on detail-oriented and repetitive job assignments.
  • Responsible and reliable.
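Turning data into actionable information, as the skills above describe, can start with basic descriptive statistics of the kind R or Python produce from an LMS report. A small standard-library sketch, with hypothetical quiz scores:

```python
import statistics

# Hypothetical quiz scores pulled from an LMS report.
scores = [62, 75, 75, 81, 90, 44, 68]

summary = {
    "mean": statistics.mean(scores),
    "median": statistics.median(scores),
    "stdev": round(statistics.stdev(scores), 1),
}
print(summary)
```

A gap between mean and median, or a large standard deviation, is the kind of simple finding an analytics assistant would flag to instructors.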
Job posted by
Monika Sanghvi