N H Enterprises

Power Business Intelligence Developer
Posted by Yogita Purandare
2 - 5 yrs
₹4L - ₹7L / yr
Pune, Kharadi
Skills
Experience building dashboards
Experience in data visualization
Working knowledge of Microsoft Power Pivot tables
Working experience in Power BI Desktop
Big Data
Should be able to create awesome dashboards. Should have hands-on knowledge of all of the following:
- Visualizations
- Datasets
- Reports
- Dashboards
- Tiles
Excellent querying skills using T-SQL.
Should have prior exposure to SSRS and/or SSAS.
Working knowledge of Microsoft Power Pivot, Power View, and Power BI Desktop.
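The T-SQL querying requirement above boils down to joins and aggregations over relational tables. As a minimal sketch — using Python's built-in sqlite3 module rather than SQL Server, with entirely made-up table names and figures — the kind of query a dashboard tile is built on looks like:

```python
import sqlite3

# Hypothetical dashboard data in an in-memory SQLite database.
# (T-SQL is SQL Server's dialect; the core join/aggregate skills are
# the same, so this sketch uses Python's built-in sqlite3 instead.)
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE region (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE sales  (region_id INTEGER, amount REAL);
    INSERT INTO region VALUES (1, 'North'), (2, 'South');
    INSERT INTO sales  VALUES (1, 100.0), (1, 250.0), (2, 75.0);
""")

# The kind of join + aggregation a dashboard tile is typically built on.
rows = conn.execute("""
    SELECT r.name, SUM(s.amount) AS total
    FROM sales s JOIN region r ON r.id = s.region_id
    GROUP BY r.name
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('North', 350.0), ('South', 75.0)]
```

In Power BI itself the same aggregation would usually live in a DAX measure, or be pushed down to the source in T-SQL.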
Users love Cutshort
Read about what our users have to say about finding their next opportunity on Cutshort.

Subodh Popalwar

Software Engineer, Memorres
For 2 years, I had trouble finding a company with good work culture and a role that will help me grow in my career. Soon after I started using Cutshort, I had access to information about the work culture, compensation and what each company was clearly offering.

About N H Enterprises

Founded: 2014
Type:
Size: 0-20
Stage: Bootstrapped
About
Founded in 2014, N H Enterprises is a bootstrapped company based in India. It has 1-5 employees currently and works in the domain of IT Consultancy.
Connect with the team
Yogita Purandare
Company social profiles
N/A

Similar jobs

xyz
at xyz
Agency job
via HR BIZ HUB by Pooja shankla
Bengaluru (Bangalore)
4 - 6 yrs
₹12L - ₹15L / yr
Java
Big Data
Apache Hive
Hadoop
Spark

Job Title: Big Data Developer

Job Description

Bachelor's degree in Engineering or Computer Science or equivalent OR Master's in Computer Applications or equivalent.

Solid software development experience, including leading teams of engineers and scrum teams.

4+ years of hands-on experience working with Map-Reduce, Hive, and Spark (core, SQL, and PySpark).

Solid understanding of data warehousing concepts.

Knowledge of the financial reporting ecosystem is a plus.

4+ years of experience in Data Engineering/Data Warehousing using Big Data technologies is an added advantage.

Expertise in distributed ecosystems.

Hands-on programming experience with Core Java or Python/Scala.

Expert knowledge of Hadoop and Spark architecture and their working principles.

Hands-on experience writing and understanding complex SQL (Hive/PySpark DataFrames) and optimizing joins while processing huge amounts of data.

Experience in UNIX shell scripting.
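The Map-Reduce requirement above can be illustrated at toy scale in plain Python — real jobs run distributed on Hadoop or Spark, and the input lines here are invented:

```python
from collections import defaultdict
from itertools import chain

def map_phase(line):
    # Map: emit a (word, 1) pair for every word in the input line.
    return [(word, 1) for word in line.split()]

def reduce_phase(pairs):
    # Reduce: sum the counts per key, after an implicit shuffle
    # (the grouping a real framework does between map and reduce).
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["big data big", "data pipelines"]
mapped = chain.from_iterable(map_phase(l) for l in lines)
print(reduce_phase(mapped))  # {'big': 2, 'data': 2, 'pipelines': 1}
```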

Roles & Responsibilities

Ability to design and develop optimized data pipelines for batch and real-time data processing.

Should have experience in analysis, design, development, testing, and implementation of system applications

Demonstrated ability to develop and document technical and functional specifications and analyze software and system processing flows.

Excellent technical and analytical aptitude

Good communication skills.

Excellent Project management skills.

Results driven Approach.

Mandatory Skills: Big Data, PySpark, Hive
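The join-optimization skill listed above often comes down to choosing a hash-based join: Spark's broadcast-hash join, for instance, builds a hash table from the small relation on each executor and probes it with the large one. A minimal single-machine sketch, with hypothetical dimension/fact rows:

```python
def hash_join(small, large, key_small, key_large):
    # Build a hash index on the smaller relation (what Spark's
    # broadcast-hash join does per executor), then probe with the
    # larger one: O(n + m) instead of the O(n * m) nested loop.
    index = {}
    for row in small:
        index.setdefault(row[key_small], []).append(row)
    out = []
    for row in large:
        for match in index.get(row[key_large], []):
            out.append({**match, **row})
    return out

# Made-up star-schema rows: a small dimension and a larger fact table.
dims  = [{"id": 1, "country": "IN"}, {"id": 2, "country": "US"}]
facts = [{"id": 1, "amt": 10}, {"id": 1, "amt": 20}, {"id": 3, "amt": 5}]
print(hash_join(dims, facts, "id", "id"))
```

In Hive/PySpark the equivalent decision is hinting `broadcast()` on the small side rather than letting a shuffle join move both tables.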

Hexr Factory Immersive Tech
Posted by Fathariya Begam
Chennai
0 - 2 yrs
₹2L - ₹5L / yr
C++
OpenCV
Computer Vision
Git
Ruby
+3 more

Company Introduction


About Hexr Factory:

We are always exploring the possibilities to bridge the physical and digital worlds. We design and build Metaverse & Digital twin technologies for the future of industry and entertainment.


Job type & Location


Project Role: C++ Developer


Project Role Description:


The primary focus will be the development of all core engines, designing back-end components, integrating data storage, and ensuring high performance and responsiveness to requests from the front end. You will also be responsible for integrating the front-end elements built by your co-workers/Third-Party into the application.


Work Experience: 2 - 8 years


Work location: Chennai


Must-Have Skills: C++, OpenCV, Ruby, Boost C++ libraries, MySQL, MQTT. 


Key Responsibilities:

  • Extensive knowledge of C++ frameworks and libraries for video processing and analytics, utilizing openCCTV and FFMPEG.
  • Multi-threading programming, Distributed and Parallel computing, Big Data technologies, SQL Server programming with T-SQL, and Microsoft data access technologies.
  • Familiar with libraries like OpenCV, FFmpeg, GStreamer, and Directshow.
  • Extensive knowledge of RTSP, RTMP, and HLS video streaming protocols.
  • Candidate should know about release activities, source control, merging, and branching concepts.
  • Ability to analyze and visualize BIG data effectively.
  • Machine learning: KNN, SVM, text search.
  • Familiar with QGIS.
  • A good understanding of Computer Vision and Image Processing concepts and algorithms.
  • Thorough knowledge of the standard library, STL containers, and algorithms.
  • Strong background in object-oriented design, prioritizing testability and reusability.
  • Familiarity with embedded systems design and low-level hardware interactions.
  • Proven track record of identifying bottlenecks and bugs, and devising solutions to these problems.
  • Hands-on Algorithm development and implementation.
  • Experience supporting and working with cross-functional teams in a dynamic environment.
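The KNN item in the list above can be sketched in a few lines of plain Python — a production pipeline would use C++/OpenCV or an ML library, and the training points here are made up:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    # train: list of ((x, y), label) pairs. Classify the query point
    # by majority vote among its k nearest neighbours (Euclidean).
    neighbours = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

# Two invented, well-separated clusters.
train = [((0, 0), "A"), ((1, 0), "A"), ((5, 5), "B"), ((6, 5), "B")]
print(knn_predict(train, (0.5, 0.2)))  # A
print(knn_predict(train, (5.5, 5.0)))  # B
```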


Skills Required:

  • Experience with existing computer vision toolkits such as OpenCV.
  • Awareness of current trends within Computer Vision and Image Processing in academia and the community.
  • Deep learning using convolutional neural networks for object classification, recognition, or sequence modeling.
  • Experience with any of the following: Object detection and target tracking, simultaneous localization and mapping, 3D reconstruction, camera calibration, behavior analysis, automated video surveillance, virtual makeup, and related fields.
  • Proficient understanding of code versioning tools, such as Git.
  • Passionate about new technology and innovation.
  • Understanding the nature of asynchronous programming and its quirks and workarounds.
  • Excellent verbal and written communication skills.



US Based Product Organization
Bengaluru (Bangalore)
10 - 15 yrs
₹25L - ₹45L / yr
Hadoop
HDFS
Apache Hive
Zookeeper
Cloudera
+8 more

Responsibilities :

  • Provide support services to our Gold & Enterprise customers using our flagship product suites. This may include assistance provided during the engineering and operations of distributed systems, as well as responses for mission-critical systems and production customers.
  • Lead end-to-end delivery and customer success of next-generation features related to scalability, reliability, robustness, usability, security, and performance of the product
  • Lead and mentor others about concurrency, parallelization to deliver scalability, performance, and resource optimization in a multithreaded and distributed environment
  • Demonstrate the ability to actively listen to customers and show empathy to the customer’s business impact when they experience issues with our products


Required Skills:

  • 10+ years of experience with highly scalable, distributed, multi-node environments (100+ nodes)
  • Hadoop operations, including Zookeeper, HDFS, YARN, Hive, and related components like the Hive metastore and Cloudera Manager/Ambari
  • Authentication and security configuration and tuning (KNOX, LDAP, Kerberos, SSL/TLS, second priority: SSO/OAuth/OIDC, Ranger/Sentry)
  • Java troubleshooting, e.g., collection and evaluation of jstacks, heap dumps
  • Linux, NFS, Windows, including application installation, scripting, basic command line
  • Docker and Kubernetes configuration and troubleshooting, including Helm charts, storage options, logging, and basic kubectl CLI
  • Experience working with scripting languages (Bash, PowerShell, Python)
  • Working knowledge of application, server, and network security management concepts
  • Familiarity with virtual machine technologies
  • Knowledge of databases like MySQL and PostgreSQL
  • Certification on any of the leading cloud providers (AWS, Azure, GCP) and/or Kubernetes is a big plus
Hiring for a leading client
New Delhi
3 - 5 yrs
₹10L - ₹15L / yr
Big Data
Apache Kafka
Business Intelligence (BI)
Data Warehouse (DWH)
Coding
+15 more
Job Description:
Senior Software Engineer - Data Team

We are seeking a highly motivated Senior Software Engineer with hands-on experience building scalable, extensible data solutions, identifying and addressing performance bottlenecks, collaborating with other team members, and implementing best practices for data engineering. Our engineering process is fully agile with a really fast release cycle, which keeps our environment very energetic and fun.

What you'll do:

Design and develop scalable applications.
Work with Product Management teams to get maximum value out of existing data.
Contribute to continual improvement by suggesting improvements to the software system.
Ensure high scalability and performance.
Advocate for good, clean, well-documented, and performant code; follow standards and best practices.
We'd love for you to have:

Education: Bachelor's/Master's degree in Computer Science.
Experience: 3-5 years of relevant experience in BI/DW with hands-on coding experience.

Mandatory Skills

Strong problem-solving skills
Strong experience with Big Data technologies: Hive, Hadoop, Impala, HBase, Kafka, Spark
Strong experience with orchestration frameworks like Apache Oozie and Airflow
Strong data engineering experience
Strong experience with Database and Data Warehousing technologies and ability to understand complex design, system architecture
Experience with the full software development lifecycle, design, develop, review, debug, document, and deliver (especially in a multi-location organization)
Good knowledge of Java
Desired Skills

Experience with Python
Experience with reporting tools like Tableau, QlikView
Experience with Git and CI/CD pipelines
Awareness of cloud platforms, e.g., AWS
Excellent communication skills with team members, Business owners, across teams
Be able to work in a challenging, dynamic environment and meet tight deadlines
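Orchestration frameworks like Apache Oozie and Airflow, listed in the skills above, fundamentally resolve a DAG of task dependencies into a valid execution order. Python's standard-library graphlib sketches the idea; the pipeline and task names here are hypothetical:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of tasks it
# depends on, exactly as an Airflow DAG would express it.
dag = {
    "extract": set(),
    "validate": {"extract"},
    "transform": {"validate"},
    "load_dwh": {"transform"},
    "report": {"load_dwh"},
}

# A topological order is any order in which every task runs only
# after all of its dependencies have completed.
order = list(TopologicalSorter(dag).static_order())
print(order)  # ['extract', 'validate', 'transform', 'load_dwh', 'report']
```

A real orchestrator adds scheduling, retries, and backfills on top of this ordering, but the dependency resolution is the same.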
Loonycorn
at Loonycorn
1 recruiter
Posted by Vitthal Srinivasan
Bengaluru (Bangalore)
0 - 7 yrs
₹3L - ₹5L / yr
Data Structures
Big Data
Cloud Computing
Kubernetes
Java
+1 more
About Loonycorn:

Founded by Google, Stanford, and Columbia alumni, Loonycorn is a leading studio for e-learning content on machine learning, cloud computing, blockchain, and other emerging technologies.

About the Role:

We are looking for folks to build software projects which will be part of technical content similar to what you'd find at the links below:

https://www.pluralsight.com/search?q=janani+ravi
https://www.pluralsight.com/search?q=vitthal+srinivasan
https://www.udemy.com/u/janani-ravi-2/

This involves:
- learning a new technology from scratch
- coding real-world projects which use that technology (you will be a software engineer writing code, not making slides or talking about code)

What is important to us:
- Grit - Perseverance in working on hard problems. Technical video-making is difficult and detail-oriented (that's why it is a highly profitable business)
- Craftsmanship - Our video-making is quite artisanal - lots of hard work and small details. There are many excellent roles where doing smart 80-20 trade-offs is the way to succeed - this is not one of them.
- Clarity - Talking and thinking in direct, clear ways is super-important in what we do. Folks who use a lot of jargon or cliches, or work on writing code without understanding what goes on underneath will not be a fit
- Creativity - Analogies, technical metaphors, and other artistic elements are an important part of what we do.

What is not all that important to us:
- Your school or labels: Perfectly fine whatever college or company you are applying from
- English vocabulary or pronunciation: You don't need to 'talk well' or be flashy to build good projects
EASEBUZZ
at EASEBUZZ
1 recruiter
Posted by Amala Baby
Pune
2 - 4 yrs
₹2L - ₹20L / yr
Spotfire
Qlikview
Tableau
PowerBI
Data Visualization
+12 more

Company Profile:

 

Easebuzz is a payment solutions company (a fintech organisation) which enables online merchants to accept, process, and disburse payments through developer-friendly APIs. We are focused on building plug-n-play products, including the payment infrastructure, to solve complete business problems. It is definitely a wonderful place where all the action related to payments, lending, subscriptions, and eKYC is happening at the same time.

 

We have been consistently profitable and are constantly developing new innovative products; as a result, we have grown 4x over the past year alone. We are well capitalised and recently closed a fundraise of $4M in March 2021 from prominent VC firms and angel investors. The company is based out of Pune and has a total strength of 180 employees. Easebuzz's corporate culture is tied into the vision of building a workplace which breeds open communication and minimal bureaucracy. An equal opportunity employer, we welcome and encourage diversity in the workplace. One thing you can be sure of is that you will be surrounded by colleagues who are committed to helping each other grow.

 

Easebuzz Pvt. Ltd. has its presence in Pune, Bangalore, Gurugram.

 


Salary: As per company standards.

 

Designation: Data Engineering

 

Location: Pune

 

Experience with ETL, Data Modeling, and Data Architecture

Design, build, and operationalize large-scale enterprise data solutions and applications using one or more AWS data and analytics services (Spark, EMR, DynamoDB, Redshift, Kinesis, Lambda, Glue) in combination with third-party tools.

Experience with AWS cloud data lake for development of real-time or near real-time use cases

Experience with messaging systems such as Kafka/Kinesis for real time data ingestion and processing

Build data pipeline frameworks to automate high-volume and real-time data delivery

Create prototypes and proof-of-concepts for iterative development.

Experience with NoSQL databases, such as DynamoDB, MongoDB etc

Create and maintain optimal data pipeline architecture.

Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.


Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.

Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.

Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.

Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.

Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.

Evangelize a very high standard of quality, reliability and performance for data models and algorithms that can be streamlined into the engineering and sciences workflow

Build and enhance data pipeline architecture by designing and implementing data ingestion solutions.
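Messaging systems such as Kafka/Kinesis, referenced above, decouple producers from consumers through a buffered log. A toy in-process stand-in using Python's queue and threading modules — the event fields and conversion rate are invented:

```python
import queue
import threading

topic = queue.Queue()   # stand-in for a Kafka topic/partition
SENTINEL = object()     # end-of-stream marker (a real topic never ends)
processed = []

def producer():
    # "Publish" a couple of hypothetical payment events.
    for event in ({"user": "u1", "amount": 100},
                  {"user": "u2", "amount": 250}):
        topic.put(event)
    topic.put(SENTINEL)

def consumer():
    # "Poll" the topic and enrich each event as it arrives.
    while True:
        event = topic.get()
        if event is SENTINEL:
            break
        processed.append({**event, "amount_x2": event["amount"] * 2})

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start(); t1.join(); t2.join()
print(processed)
```

A real Kafka/Kinesis pipeline adds durability, partitioning, and consumer groups, but the producer/consumer decoupling is the essential shape.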

 

Employment Type

Full-time

 

CNH Industrial
at CNH Industrial
1 recruiter
Posted by Prabhat Mishra
Gurugram
3 - 8 yrs
₹15L - ₹30L / yr
Big Data
Scala
Spring Boot
Java

Job Description:

 

As a Software Engineer – Java, Scala, Big Data, you will join a highly skilled software team in delivering innovative mobile and web applications that make up CNH Industrial's next-generation digital platform. The digital platform will enable products that integrate with connected CNH Industrial tractors, sprayers, and combines, and enable a wide range of farm management capabilities.

 

This is an excellent opportunity to join the technology revolution currently taking place across the agricultural industry and work with highly skilled and talented people in a global, diversified company.

 

Primary responsibilities include working closely with product management, UX designers and backend developers on the design, development, testing and deployment of our next generation applications and existing product lines. You will undertake all assigned tasks and responsibilities effectively and professionally in accordance with company, team and customer expectations.

 

Essential Duties and Responsibilities

 

  1. Work in a team or individually to design, develop and test software for cloud, web and mobile
  2. Design, develop, test and document quality software to user and functional requirements within specified timeframes and in accordance with CNHI coding standards
  3. Generate rapid prototypes for feasibility testing
  4. Generate all documentation relevant to software operation
  5. Adhere to prescribed development systems, processes and procedures, and ensure efficient, effective, high-quality delivery
  6. Communicate effectively with all stakeholders
  7. Perform tasks as specified by the Delivery Lead/Team Lead
  8. Other related duties as required

 

Competencies

 

Qualifications

To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill and/or ability required.

 

Education Qualifications and/or Experience

§ Bachelor's degree in Computer Science or Computer Engineering from an accredited university

§ 3+ years of industry experience

§ 3+ years of Java Spring Boot server application design and testing experience

§ Experience with writing server applications, working with large amounts of data, and Big Data processing – batch and streaming

§ Strong fundamentals on OOP, RESTful architectures, Design Patterns, Data Structures, Algorithms

§ Experience with RESTful API development

§ Experience with Microservices Development; working on Docker, Kubernetes, Helm/Terraform

§ Experience with Microsoft Azure and cloud services, including exposure to PaaS services like Service Bus, Event Hub, blob stores, key vaults, API managers, Function Apps (serverless)

§ Experience with concurrency topics, asynchronous programming

 

Computer Skills

 

§ Java (Spring, Spring Boot, Hibernate) ; Scala

§ Microservices, Web Services,

§ OAuth 2.0 (JWT), Swagger, Postman, Open API Specification

§ Relational (SQL Server / Postgres); NoSQL (CosmosDB / MongoDB)

§ Big Data/Geospatial: HBase 2.1.6 (HDI 4.0), GeoMesa 3.0.0

§ Batch and stream data processing: Apache Kafka, Apache Flink, Apache Spark, Azure Service Bus

§ Caching: Redis

§ Good working knowledge of CI/CD environments (preferably Azure DevOps), Git or similar configuration management software; Build Automation (Maven)

§ Knowledge of Testing Tools such as Jasmine, Cypress, NUnit, xUnit, Junit, Testcontainers
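The concurrency and asynchronous-programming requirement above is language-agnostic (this role's stack is Java/Scala); the core idea can be sketched with Python's asyncio. Two simulated I/O calls run concurrently, so the total wait is roughly the longer delay rather than the sum. The endpoint names are made up:

```python
import asyncio

async def fetch(name, delay):
    # Simulate a non-blocking I/O call (e.g. a REST request).
    await asyncio.sleep(delay)
    return f"{name}:ok"

async def main():
    # Launch both "requests" concurrently; gather preserves the
    # argument order in its result list.
    return await asyncio.gather(fetch("tractors", 0.05),
                                fetch("combines", 0.02))

print(asyncio.run(main()))  # ['tractors:ok', 'combines:ok']
```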

MNC
Bengaluru (Bangalore)
3 - 9 yrs
₹8L - ₹16L / yr
Windows Azure
Hadoop
Spark
Data Structures
ADF
+1 more
  • Working knowledge of setting up and running HDInsight applications
  • Hands-on experience in Spark, Scala & Hive
  • Hands-on experience in ADF – Azure Data Factory
  • Hands-on experience in Big Data & Hadoop ecosystems
  • Exposure to Azure service categories like PaaS components and IaaS subscriptions
  • Ability to design and develop ingestion & processing frameworks for ETL applications
  • Hands-on experience in PowerShell scripting and deployment on Azure
  • Experience in performance tuning and memory configuration
  • Should be adaptable to learn & work on new technologies
  • Should have good written and spoken communication skills
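An ingestion & processing framework for ETL, as required above, is at heart a chain of composable stages. A minimal generator-based sketch with invented record shapes (a real Azure implementation would use ADF and Spark for each stage):

```python
def extract(rows):
    # Extract: yield raw records (stand-in for reading from a source).
    yield from rows

def transform(records):
    # Transform: drop incomplete records, clean and type the rest.
    for r in records:
        if r.get("qty") is not None:
            yield {"sku": r["sku"].upper(), "qty": int(r["qty"])}

def load(records):
    # Load: materialise into the target store (a plain list here).
    return list(records)

raw = [{"sku": "a1", "qty": "3"}, {"sku": "b2", "qty": None}]
print(load(transform(extract(raw))))  # [{'sku': 'A1', 'qty': 3}]
```

Because each stage consumes and yields records lazily, the same shape scales from this toy to a streaming pipeline.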
Kalibre global konnects
at Kalibre global konnects
7 recruiters
Posted by Monika Sanghvi
Ahmedabad
1 - 4 yrs
₹3L - ₹5L / yr
Python
R Programming
Data Science
Data Analytics
Big Data
Job Description:
Job Title: Assistant Manager - Business Analytics
• Age: Max. 35 years
• Working Days: 6 days a week
• Location: Ahmedabad, Gujarat
• Monthly CTC: Salary will be commensurate with experience.
• Educational Qualification: The candidate should have a bachelor's degree in IT/Engineering from any recognized university.
• Experience: 2+ years of work experience in AI/ML/business analytics with an institute of repute or a corporate.

Required Technical Skills:
• A fair understanding of Business Analytics, Data Science, Visualization/Big Data, etc.
• Basic knowledge of different analytical tools such as R Programming, Python, etc.
• Hands-on experience in Moodle development (desirable).
• Good knowledge of customizing Moodle functionalities and developing custom themes for Moodle (desirable).
• An analytical mindset; enjoys helping participants solve problems and turning data into useful, actionable information.

Key Responsibilities include:
• Understand the tools and technologies specific to e-learning and blended learning development and delivery.
• Provide academic as well as technical assistance to the faculty members teaching the analytics courses.
• Work closely with the Instructors, assisting them in programming, coding, testing, etc.
• Prepare the lab study material in coordination with the Instructors and assist students in the programming lab, solving their doubts.
• Work on assignments dealing with the routine and daily operation, use, and configuration of the Learning Management System (LMS).
• Administer learning technology platforms, including the creation of courses, certifications and other e-learning programs on the platforms.
• Provide support within the eLearning department, provide technical support to external clients, and administrate the Learning Management System.
• Create user groups, assign content and assessments to the right target audience, run reports, and create learning events in the LMS.
• Perform regular maintenance of the LMS database, including adding or removing courses.
• Upload, test, deploy and maintain all training materials/learning assets hosted in the LMS.
• Ability to multi-task.
• Ability to demonstrate accuracy on detail-oriented and repetitive job assignments.
• Responsible and reliable.