Project Engineer Intern

Posted by Bhavani Thanga
0 - 0 yrs
₹1.2L - ₹3.5L / yr
Hyderabad
Skills
Spark
Hadoop
Big Data
Data engineering
PySpark
Hibernate (Java)
Jasmine (Javascript Testing Framework)
SQL
Python

Job description

About Company
Helical Insight, an open-source Business Intelligence tool from Helical IT Solutions Pvt. Ltd., based out of Hyderabad, is looking for freshers with strong knowledge of SQL. Helical Insight has more than 50 clients across various sectors and has been awarded the most promising company in the Business Intelligence space. We are looking for a rockstar teammate to join our company.
Job Brief
We are looking for a Business Intelligence (BI) Developer to create and manage BI and analytics solutions that turn data into knowledge.
In this role, you should have a background in data and business analysis. You should be analytical and an excellent communicator. If you also have business acumen and a problem-solving aptitude, we’d like to meet you. Excellent knowledge of SQL queries is required. Basic knowledge of HTML, CSS and JS is required.
You would be working closely with customers across various domains to understand their data and business requirements, and to deliver the required analytics in the form of reports, dashboards, etc. This is an excellent client-interfacing role with the opportunity to work across various sectors and geographies, as well as various kinds of databases including NoSQL, RDBMS, graph DB, columnar DB, etc.
Skill set and Qualification required
Responsibilities
• Attend client calls to gather requirements and show progress
• Translate business needs to technical specifications
• Design, build and deploy BI solutions (e.g. reporting tools)
• Maintain and support data analytics platforms
• Conduct unit testing and troubleshooting
• Evaluate and improve existing BI systems
• Collaborate with teams to integrate systems
• Develop and execute database queries and conduct analyses
• Create visualizations and reports for requested projects
• Develop and update technical documentation
Requirements
• Excellent expertise in SQL queries (see the SQL sketch below)
• Proven experience as a BI Developer or Data Scientist
• Background in data warehouse design (e.g. dimensional modeling) and data mining
• In-depth understanding of database management systems, online analytical processing (OLAP) and ETL (Extract, Transform, Load) frameworks
• Familiarity with BI technologies
• Proven ability to take initiative and be innovative
• Analytical mind with a problem-solving aptitude
• BE in Computer Science/IT
Education: BE/BTech/MCA/BCA/MTech/MS, or equivalent preferred.
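Since strong SQL is the core requirement here, the sketch below shows the kind of query a BI developer might write for a monthly report. It is a minimal sketch only: the sales table, columns and data are placeholders run against an in-memory SQLite database, not Helical Insight’s actual schema.

```python
# Minimal sketch: a typical BI-style aggregate query, run against a
# placeholder schema in in-memory SQLite purely for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE sales (region TEXT, sale_date TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('North', '2023-01-05', 120.0),
        ('South', '2023-01-06', 80.5),
        ('North', '2023-02-01', 42.0);
    """
)

# Monthly revenue per region - a common building block for a dashboard.
report = conn.execute(
    """
    SELECT region,
           strftime('%Y-%m', sale_date) AS month,
           SUM(amount) AS revenue
    FROM sales
    GROUP BY region, month
    ORDER BY month, region
    """
).fetchall()

for region, month, revenue in report:
    print(region, month, revenue)
```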
Interested candidates can call us at +91 7569 765 162.
About Helical IT Solutions

Founded: 2011
Size: 20-100
Stage: Profitable
About:
Helical IT Solutions Pvt Ltd specializes in Data Warehousing, Business Intelligence and Big Data Analytics. We offer consultation in selection of correct hardware and software as per requirement, implementation of data warehouse modeling, big data, data processing using Apache Spark or ETL tools and building data analysis in the form of reports and dashboards with supporting features such as data security, alerts and notification, etc.
Connect with the team
Bhavani Thanga
Niyotee Gupta
Monica Patidar
Sukhitha Sadasivuni
Company social profiles
Blog | LinkedIn | Twitter

Similar jobs

Remote only
3 - 7 yrs
₹15L - ₹24L / yr
Data Science
Machine Learning (ML)
Artificial Intelligence (AI)
Natural Language Processing (NLP)
R Programming
+4 more

  Senior Data Scientist

  • 6+ years of experience building data pipelines and deployment pipelines for machine learning models
  • 4+ years of experience with ML/AI toolkits such as TensorFlow, Keras, AWS SageMaker, MXNet, H2O, etc.
  • 4+ years of experience developing ML/AI models in Python/R
  • Leadership skills to lead and deliver projects, be proactive, take ownership, interface with the business, represent the team and spread knowledge
  • Strong knowledge of statistical data analysis and machine learning techniques (e.g., Bayesian, regression, classification, clustering, time series, deep learning)
  • Should be able to help deploy various models and tune them for better performance
  • Working knowledge of operationalizing models in production using model repositories, APIs and data pipelines (a minimal serving sketch follows this list)
  • Experience with machine learning and computational statistics packages
  • Experience with Databricks, Data Lake
  • Experience with Dremio, Tableau, Power BI
  • Experience working with Spark ML and Spark DL with PySpark would be a big plus!
  • Working knowledge of relational database systems like SQL Server, Oracle
  • Knowledge of deploying models on platforms like PCF, AWS, Kubernetes
  • Good knowledge of continuous integration suites like Jenkins
  • Good knowledge of web servers (Apache, NGINX)
  • Good knowledge of Git, GitHub, Bitbucket
  • Java, R, and Python programming experience
  • Should be very familiar with MS SQL, Teradata, Oracle, DB2
  • Big Data – Hadoop
  • Expert knowledge using BI tools, e.g. Tableau
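As a rough illustration of the "operationalizing models via APIs" point above, here is a minimal sketch that trains a toy scikit-learn model, persists it with joblib, and serves predictions from a small Flask endpoint. The libraries, endpoint path and feature layout are assumptions made for illustration, not part of this job description.

```python
# Minimal model-serving sketch: persist a trained model and expose it
# over a REST endpoint. All names here are illustrative assumptions.
import joblib
import numpy as np
from flask import Flask, jsonify, request
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# Train a toy model and write it to a "model repository" (a local file here).
X, y = load_iris(return_X_y=True)
joblib.dump(LogisticRegression(max_iter=1000).fit(X, y), "model.joblib")

app = Flask(__name__)
model = joblib.load("model.joblib")  # load once at startup

@app.route("/predict", methods=["POST"])
def predict():
    # Expects a JSON body like {"features": [[5.1, 3.5, 1.4, 0.2]]}
    features = np.array(request.get_json()["features"])
    return jsonify({"predictions": model.predict(features).tolist()})

if __name__ == "__main__":
    app.run(port=8000)
```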

 

Mumbai
3 - 9 yrs
₹30L - ₹40L / yr
Machine Learning (ML)
Natural Language Processing (NLP)
Deep Learning
Data Science
Computer Vision
+4 more

Responsibilities


  • Work on execution and scheduling of all tasks related to assigned projects' deliverable dates
  • Optimize and debug existing codes to make them scalable and improve performance
  • Design, development, and delivery of tested code and machine learning models into production environments
  • Work effectively in teams, managing and leading teams
  • Provide effective, constructive feedback to the delivery leader
  • Manage client expectations and work with an agile mindset with machine learning and AI technology
  • Design and prototype data-driven solutions

Eligibility


  • Highly experienced in designing, building, and shipping scalable and production-quality machine learning algorithms in the field of Python applications
  • Working knowledge and experience in NLP core components (NER, Entity Disambiguation, etc.); a minimal NER sketch follows this list
  • In-depth expertise in Data Munging and Storage (Experienced in SQL, NoSQL, MongoDB, Graph Databases)
  • Expertise in writing scalable APIs for machine learning models
  • Experience with maintaining code logs, task schedulers, and security
  • Working knowledge of machine learning techniques, feed-forward, recurrent and convolutional neural networks, entropy models, supervised and unsupervised learning
  • Experience with at least one of the following: Keras, TensorFlow, Caffe, or PyTorch
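To make the NER item above concrete, here is a minimal sketch using spaCy as an assumed example library (the posting does not prescribe a specific toolkit); the model name and sample sentence are placeholders.

```python
# Minimal NER sketch, assuming spaCy and its small English model
# (en_core_web_sm) are installed; the sentence is illustrative only.
import spacy

nlp = spacy.load("en_core_web_sm")
doc = nlp("Apple is opening a new office in Mumbai in January.")

# Print each recognized entity with its label (ORG, GPE, DATE, ...).
for ent in doc.ents:
    print(ent.text, ent.label_)
```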
Slintel
Agency job
via Qrata by Prajakta Kulkarni
Bengaluru (Bangalore)
4 - 9 yrs
₹20L - ₹28L / yr
Big Data
ETL
Apache Spark
Spark
Data engineer
+5 more
Responsibilities
  • Work in collaboration with the application team and integration team to design, create, and maintain optimal data pipeline architecture and data structures for Data Lake/Data Warehouse.
  • Work with stakeholders including the Sales, Product, and Customer Support teams to assist with data-related technical issues and support their data analytics needs.
  • Assemble large, complex data sets from third-party vendors to meet business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL, Elasticsearch, MongoDB, and AWS technology.
  • Streamline existing and introduce enhanced reporting and analysis solutions that leverage complex data sources derived from multiple internal systems.

Requirements
  • 5+ years of experience in a Data Engineer role.
  • Proficiency in Linux.
  • Must have SQL knowledge and experience working with relational databases, query authoring (SQL), as well as familiarity with databases including MySQL, MongoDB, Cassandra, and Athena.
  • Must have experience with Python/Scala.
  • Must have experience with Big Data technologies like Apache Spark.
  • Must have experience with Apache Airflow (a minimal DAG sketch follows this list).
  • Experience with data pipeline and ETL tools like AWS Glue.
  • Experience working with AWS cloud services: EC2, S3, RDS, Redshift.
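As a hedged illustration of the Apache Airflow requirement, below is a minimal DAG sketch for a daily extract-transform-load flow; the DAG id, schedule and task bodies are placeholders, not an actual pipeline from this role.

```python
# Minimal Apache Airflow (2.x) DAG sketch: a daily extract -> transform -> load
# flow with placeholder task bodies.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw data from source systems")

def transform():
    print("clean and reshape the extracted data")

def load():
    print("write curated data to the warehouse")

with DAG(
    dag_id="etl_sketch",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> transform_task >> load_task
```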
Bengaluru (Bangalore)
3 - 10 yrs
₹30L - ₹50L / yr
Data engineering
Data modeling
Python

Requirements:

  • 2+ years of experience (4+ for Senior Data Engineer) with system/data integration, development or implementation of enterprise and/or cloud software
  • Engineering degree in Computer Science, Engineering or a related field
  • Extensive hands-on experience with data integration/EAI technologies (File, API, Queues, Streams), ETL tools and building custom data pipelines (a small pipeline sketch follows this list)
  • Demonstrated proficiency with Python, JavaScript and/or Java
  • Familiarity with version control/SCM is a must (experience with git is a plus)
  • Experience with relational and NoSQL databases (any vendor)
  • Solid understanding of cloud computing concepts
  • Strong organisational and troubleshooting skills with attention to detail
  • Strong analytical ability, judgment and problem-solving techniques
  • Interpersonal and communication skills with the ability to work effectively in a cross-functional team
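To make the "custom data pipelines" item concrete, here is a small sketch that pulls records from an HTTP API and loads them into a relational table. The endpoint URL, record shape and choice of libraries (requests plus SQLite) are assumptions for illustration only.

```python
# Small custom data-pipeline sketch: extract from an HTTP API, load into a
# relational table. URL and schema are placeholders; assumes each record
# looks like {"id": 1, "amount": 9.99} and that `requests` is installed.
import sqlite3

import requests

API_URL = "https://example.com/api/orders"  # placeholder endpoint

def extract(url):
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return response.json()

def load(rows, db_path="warehouse.db"):
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS orders (id INTEGER PRIMARY KEY, amount REAL)"
        )
        conn.executemany(
            "INSERT OR REPLACE INTO orders (id, amount) VALUES (:id, :amount)", rows
        )

if __name__ == "__main__":
    load(extract(API_URL))
```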


Kloud9 Technologies
Bengaluru (Bangalore)
3 - 6 yrs
₹5L - ₹20L / yr
Amazon Web Services (AWS)
Amazon EMR
EMR
Spark
PySpark
+9 more

About Kloud9:

 

Kloud9 exists with the sole purpose of providing cloud expertise to the retail industry. Our team of cloud architects, engineers and developers help retailers launch a successful cloud initiative so you can quickly realise the benefits of cloud technology. Our standardised, proven cloud adoption methodologies reduce the cloud adoption time and effort so you can directly benefit from lower migration costs.

 

Kloud9 was founded with the vision of bridging the gap between E-commerce and cloud. The E-commerce of any industry is limiting and poses a huge challenge in terms of the finances spent on physical data structures.

 

At Kloud9, we know migrating to the cloud is the single most significant technology shift your company faces today. We are your trusted advisors in transformation and are determined to build a deep partnership along the way. Our cloud and retail experts will ease your transition to the cloud.

 

Our sole focus is to provide cloud expertise to the retail industry, giving our clients the empowerment that will take their business to the next level. Our team of proficient architects, engineers and developers has been designing, building and implementing solutions for retailers for an average of more than 20 years.

 

We are a cloud vendor that is both platform and technology independent. Our vendor independence not only provides us with a unique perspective into the cloud market but also ensures that we deliver the cloud solutions that best meet our clients' requirements.


What we are looking for:

● 3+ years’ experience developing Data & Analytics solutions
● Experience building data lake solutions leveraging one or more of the following: AWS EMR, S3, Hive & Spark (a minimal PySpark sketch follows this list)
● Experience with relational SQL
● Experience with scripting languages such as Shell, Python
● Experience with source control tools such as GitHub and the related dev process
● Experience with workflow scheduling tools such as Airflow
● In-depth knowledge of scalable cloud
● Has a passion for data solutions
● Strong understanding of data structures and algorithms
● Strong understanding of solution and technical design
● Has a strong problem-solving and analytical mindset
● Experience working with Agile teams
● Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders
● Able to quickly pick up new programming languages, technologies, and frameworks
● Bachelor’s Degree in Computer Science
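As a rough sketch of the "data lake with S3, Hive & Spark" item above, the snippet below reads raw CSV files from S3 with PySpark and writes them back as partitioned Parquet. Bucket names, columns and the runtime (e.g. EMR with S3 access configured) are assumptions, not Kloud9 specifics.

```python
# Minimal PySpark data-lake sketch: raw CSV in S3 -> curated, partitioned
# Parquet. Paths and columns are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("datalake_sketch").getOrCreate()

raw = (
    spark.read
    .option("header", True)
    .csv("s3://example-raw-bucket/sales/")  # placeholder input path
)

curated = (
    raw.withColumn("sale_date", F.to_date("sale_date"))
       .withColumn("amount", F.col("amount").cast("double"))
)

(
    curated.write
    .mode("overwrite")
    .partitionBy("sale_date")
    .parquet("s3://example-curated-bucket/sales/")  # placeholder output path
)
```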


Why Explore a Career at Kloud9:

 

With job opportunities in prime locations of the US, London, Poland and Bengaluru, we help build your career path in cutting-edge technologies of AI, Machine Learning and Data Science. Be part of an inclusive and diverse workforce that's changing the face of retail technology with its creativity and innovative solutions. Our vested interest in our employees translates into delivering the best products and solutions to our customers.

Novo
at Novo
2 recruiters
Dishaa Ranjan
Posted by Dishaa Ranjan
Bengaluru (Bangalore), Gurugram
4 - 6 yrs
₹25L - ₹35L / yr
SQL
Python
pandas
Scikit-Learn
TensorFlow
+1 more

About Us: 

Small businesses are the backbone of the US economy, comprising almost half of the GDP and the private workforce. Yet, big banks don’t provide the access, assistance and modern tools that owners need to successfully grow their business. 


We started Novo to challenge the status quo—we’re on a mission to increase the GDP of the modern entrepreneur by creating the go-to banking platform for small businesses (SMBs). Novo is flipping the script of the banking world, and we’re excited to lead the small business banking revolution.


At Novo, we’re here to help entrepreneurs, freelancers, startups and SMBs achieve their financial goals by empowering them with an operating system that makes business banking as easy as iOS. We developed modern bank accounts and tools to help save time and increase cash flow. Our unique product integrations enable easy access to tracking payments, transferring money internationally, managing business transactions and more. We’ve made a big impact in a short amount of time, helping thousands of organizations access powerfully simple business banking.



We are looking for a Senior Data Scientist who is enthusiastic about using data and technology to solve complex business problems. If you're passionate about leading and helping to architect and develop thoughtful data solutions, then we want to chat. Are you ready to revolutionize the small business banking industry with us?


About the Role:


  • Build and manage predictive models focussed on credit risk, fraud, conversions, churn, consumer behaviour, etc.
  • Provide best practices and direction for data analytics and business decision making across multiple projects and functional areas
  • Implement performance optimizations and best practices for scalable data models, pipelines and modelling
  • Resolve blockers and help the team stay productive
  • Take part in building the team and iterating on hiring processes

Requirements for the Role:


  • 4+ years of experience in data science roles focussed on managing data processes, modelling and dashboarding
  • Strong experience in Python, SQL and an in-depth understanding of modelling techniques
  • Experience working with pandas, scikit-learn, and visualization libraries like Plotly, Bokeh, etc.
  • Prior experience with credit risk modelling will be preferred (a small modelling sketch follows this list)
  • Deep knowledge of Python to write scripts to manipulate data and generate automated reports
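A small, hypothetical sketch of the credit-risk modelling and pandas/scikit-learn items above: it fits a logistic regression on a placeholder loans dataset and reports hold-out AUC. The file path, column names and choice of model are illustrative only.

```python
# Small credit-risk modelling sketch with pandas and scikit-learn.
# Dataset, feature names and target column are placeholders.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

df = pd.read_csv("loans.csv")  # placeholder dataset
features = ["income", "utilization", "delinquencies"]  # placeholder features

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["default"], test_size=0.2, random_state=42
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Hold-out AUC: {auc:.3f}")
```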

How We Define Success:


  • Expand access to data driven decision making across the organization
  • Solve problems in risk, marketing, growth, customer behaviour through analytics models that increase efficacy

Nice To Have, but Not Required:

  • Experience in dashboarding libraries like Python Dash and exposure to CI/CD 
  • Exposure to big data tools like Spark, and some core tech knowledge around APIs, data streaming, etc.


Novo values diversity as a core tenet of the work we do and the businesses we serve. We are an equal opportunity employer and do not discriminate on the basis of race, religion, ethnicity, national origin, citizenship, gender, gender identity, sexual orientation, age, veteran status, disability, genetic information or any other protected characteristic.

Agency job
via Morine tech by Sridevi L
Bengaluru (Bangalore)
4 - 9 yrs
₹10L - ₹25L / yr
Python
Vertex
Google Cloud Platform (GCP)
BigQuery
  • Cloud: GCP
  • Must have: BigQuery, Python, Vertex AI (a minimal BigQuery sketch follows this list)
  • Nice-to-have services: Dataplex
  • Experience level: 5-10 years
  • Preferred industry (nice to have): Manufacturing – B2B sales
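A minimal sketch of the BigQuery-plus-Python combination using the google-cloud-bigquery client; the project, dataset and table in the query are placeholders and GCP credentials are assumed to be configured in the environment.

```python
# Minimal BigQuery sketch: run an aggregate query and print the rows.
# Project/dataset/table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()

query = """
    SELECT product, SUM(quantity) AS units_sold
    FROM `example_project.example_dataset.sales`
    GROUP BY product
    ORDER BY units_sold DESC
    LIMIT 10
"""

for row in client.query(query).result():
    print(row.product, row.units_sold)
```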
8om Internet
at 8om Internet
1 recruiter
Harsh Maur
Posted by Harsh Maur
Remote only
1 - 3 yrs
₹2.4L - ₹3.5L / yr
Web Scraping
Python
Selenium
Scrapy
Job Description

1. Use Python Scrapy to crawl the website (a minimal spider sketch follows this list)
2. Work on dynamic websites and solve crawling challenges
3. Work in a fast-paced startup environment
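A minimal Scrapy spider sketch for point 1; it targets the public practice site quotes.toscrape.com, so the selectors are specific to that site rather than to any client project.

```python
# Minimal Scrapy spider sketch. Save as e.g. quotes_spider.py and run with:
#   scrapy runspider quotes_spider.py -o quotes.json
import scrapy

class QuotesSpider(scrapy.Spider):
    name = "quotes"
    start_urls = ["https://quotes.toscrape.com/"]  # public practice site

    def parse(self, response):
        for quote in response.css("div.quote"):
            yield {
                "text": quote.css("span.text::text").get(),
                "author": quote.css("small.author::text").get(),
            }
        # Follow pagination, if present.
        next_page = response.css("li.next a::attr(href)").get()
        if next_page:
            yield response.follow(next_page, callback=self.parse)
```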
UAE Client
Remote, Bengaluru (Bangalore), Hyderabad
6 - 10 yrs
₹15L - ₹22L / yr
Informatica
Big Data
SQL
Hadoop
Apache Spark
+1 more

Skills: Informatica with Big Data Management (BDM)

1. Minimum 6 to 8 years of experience in Informatica BDM development
2. Experience working on Spark/SQL (a small Spark SQL sketch follows this list)
3. Develops Informatica mappings/SQL
4. Should have experience in Hadoop, Spark, etc.
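As a small illustration of the Spark/SQL requirement (point 2), the sketch below registers an in-memory DataFrame as a temporary view and aggregates it with Spark SQL; the data is a placeholder rather than an Informatica BDM source.

```python
# Minimal Spark SQL sketch with placeholder in-memory data.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("spark_sql_sketch").getOrCreate()

orders = spark.createDataFrame(
    [(1, "north", 120.0), (2, "south", 80.5), (3, "north", 42.0)],
    ["order_id", "region", "amount"],
)
orders.createOrReplaceTempView("orders")

spark.sql(
    "SELECT region, SUM(amount) AS total FROM orders GROUP BY region"
).show()
```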
Freelancer
at Freelancer
4 recruiters
Nirmala Hk
Posted by Nirmala Hk
Bengaluru (Bangalore)
4 - 7 yrs
₹20L - ₹35L / yr
Python
Shell Scripting
MySQL
SQL
Amazon Web Services (AWS)
+3 more

  • 3+ years of experience in deployment, monitoring, tuning, and administration of high-concurrency MySQL production databases
  • Solid understanding of writing optimized SQL queries on MySQL databases (a small query sketch follows this list)
  • Understanding of AWS, VPC, networking, security groups, IAM, and roles
  • Expertise in scripting in Python or Shell/PowerShell
  • Must have experience in large-scale data migrations
  • Excellent communication skills
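A small sketch of the "optimized SQL queries on MySQL" and Python scripting items, using PyMySQL as an assumed client library: it runs EXPLAIN on an aggregate query to inspect the access plan. Connection details, the table and the date filter are placeholders.

```python
# Small MySQL sketch with PyMySQL: EXPLAIN an aggregate query to check
# whether it uses an index. All connection details are placeholders.
import pymysql

conn = pymysql.connect(
    host="localhost", user="app", password="secret", database="shop"
)

query = """
    SELECT customer_id, COUNT(*) AS orders
    FROM orders
    WHERE created_at >= %s
    GROUP BY customer_id
"""

with conn.cursor() as cur:
    cur.execute("EXPLAIN " + query, ("2023-01-01",))
    for row in cur.fetchall():
        print(row)  # inspect the chosen index / scan type

conn.close()
```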