
Data Scientists are analytical experts who use their skills in mathematics, statistics, and computer science to solve complex business problems. They collect, analyze, and interpret large datasets to extract meaningful insights and develop data-driven solutions. Their work helps organizations make informed decisions, improve processes, and gain a competitive edge.
Key Responsibilities:
- Data Collection and Preprocessing:
  - Gathering data from various sources (databases, APIs, web scraping, etc.).
  - Cleaning and transforming data to ensure quality and consistency.
  - Handling missing values, outliers, and inconsistencies.
  - Integrating data from multiple sources.
- Data Analysis and Exploration:
  - Conducting exploratory data analysis (EDA) to identify patterns, trends, and relationships.
  - Applying statistical methods and hypothesis testing to validate findings.
  - Visualizing data using charts, graphs, and other tools to communicate insights.
- Model Development and Deployment:
  - Developing and implementing machine learning models (e.g., regression, classification, clustering, deep learning).
  - Evaluating model performance and fine-tuning parameters.
  - Deploying models into production environments.
  - Creating and maintaining machine learning pipelines.
- Communication and Collaboration:
  - Presenting findings and recommendations to stakeholders in a clear and concise manner.
  - Collaborating with cross-functional teams (e.g., engineers, product managers, business analysts).
  - Documenting code, models, and processes.
  - Translating business requirements into technical implementations.
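The preprocessing duties above (imputing missing values, taming outliers) can be sketched in plain Python; the column values and the 1.5 × IQR rule here are illustrative choices, not a prescribed workflow.

```python
import statistics

# Illustrative raw values: None marks a missing entry, 999.0 is a data-entry outlier.
raw_ages = [34.0, 29.0, None, 41.0, 999.0, 38.0, None, 31.0]

# 1. Impute missing values with the median of the observed entries
#    (the median is robust to the outlier still present in the data).
observed = [v for v in raw_ages if v is not None]
median = statistics.median(observed)
filled = [median if v is None else v for v in raw_ages]

# 2. Clip outliers using the 1.5 * IQR fence, a common robust rule.
q1, _, q3 = statistics.quantiles(filled, n=4)
iqr = q3 - q1
lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
cleaned = [min(max(v, lo), hi) for v in filled]
```

In practice a library such as pandas would do this column-wise, but the logic is the same: impute with a robust statistic first, then bound extreme values.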

About HyperNovas Tech
Similar jobs
- Exploratory tester with 2–3 years of experience in software testing.
- The candidate should be an expert in GUI and functional testing of web applications.
- Strong communication skills are a must, along with the ability to collaborate with cross-functional teams.
- Should be self-driven and capable of handling responsibilities independently.
- Should have good knowledge of SQL and Jira.
- Strong proficiency in Microsoft Excel is required for test analysis and reporting.
- Should be able to understand application architecture to effectively design and execute test scenarios.
- Experience with Playwright automation is an added advantage but not mandatory.
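Since the role pairs SQL knowledge with test analysis and reporting, here is a minimal sketch of the kind of query a tester might run; the defect table and its columns are invented for illustration, not taken from any real Jira export.

```python
import sqlite3

# Hypothetical defect export (e.g. pulled from Jira); the schema is illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE defects (id INTEGER, module TEXT, severity TEXT)")
conn.executemany(
    "INSERT INTO defects VALUES (?, ?, ?)",
    [(1, "login", "high"), (2, "login", "low"),
     (3, "checkout", "high"), (4, "checkout", "high")],
)

# Defect count per module, highest first -- the sort of summary
# that would feed a test report or an Excel pivot.
rows = conn.execute(
    "SELECT module, COUNT(*) AS n FROM defects "
    "GROUP BY module ORDER BY n DESC, module"
).fetchall()
```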
Job Requirements and Responsibilities:
Senior System Administrator
- Active Directory domains, Group Policies, domain controller migrations and upgrades.
- File and print sharing, NTFS permissions, file server migrations.
- Microsoft Exchange or Office 365 messaging, Outlook configurations.
- Knowledge of data backups and backup strategies; experience with backup tools is an additional advantage.
- Basic knowledge of routers, firewalls, NAT, and VPN configuration; SonicWall experience preferable.
- Knowledge of and working experience with ticketing systems and remote administration tools.
- Good desktop troubleshooting experience.
- Antivirus installation and troubleshooting.
- Knowledge of DHCP and DNS management.
- Experience with ticketing and RMM tools (LabTech, Kaseya, Autotask) preferred.
Where: Hyderabad/ Bengaluru, India (Hybrid Mode 3 Days/Week in Office)
Job Description:
- Collaborate with stakeholders to develop a data strategy that meets enterprise needs and industry requirements.
- Create an inventory of the data necessary to build and implement a data architecture.
- Envision data pipelines and how data will flow through the data landscape.
- Evaluate current data management technologies and what additional tools are needed.
- Determine upgrades and improvements to current data architectures.
- Design, document, build and implement database architectures and applications. Should have hands-on experience in building high scale OLAP systems.
- Build data models for database structures, analytics, and use cases.
- Develop and enforce database development standards, with solid DB/query optimization capabilities.
- Build security, performance, scalability, governance, reliability, and data recovery into new systems and functions.
- Research new opportunities and create methods to acquire data.
- Develop measures that ensure data accuracy, integrity, and accessibility.
- Continually monitor, refine, and report data management system performance.
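The pipeline responsibilities above (envisioning data flow, enforcing accuracy and integrity) can be sketched as a minimal extract-transform-load flow; the source records, table name, and validation rule are illustrative stand-ins, not a specific production design.

```python
import sqlite3

def extract():
    # Stand-in for pulling records from an API or source system.
    return [{"order_id": 1, "amount": "19.90"}, {"order_id": 2, "amount": "bad"}]

def transform(records):
    # Enforce data quality: coerce types and drop rows that fail validation.
    clean = []
    for r in records:
        try:
            clean.append((r["order_id"], float(r["amount"])))
        except ValueError:
            continue  # a real pipeline would route this row to a quarantine table
    return clean

def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
```

Orchestration tools such as Airflow or Cloud Composer schedule and monitor exactly this kind of extract-transform-load step at scale.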
Required Qualifications and Skillset:
- Extensive knowledge of the Azure and GCP clouds and the DataOps data ecosystem (very strong in one of the two clouds and proficient in the other)
- Hands-on expertise in systems like Snowflake, Synapse, SQL DW, BigQuery, and Cosmos DB. (Expertise in any 3 is a must)
- Hands-on experience with Azure Data Factory, Dataiku, Fivetran, or Google Cloud Dataflow (any 2)
- Hands-on experience with services/technologies such as Apache Airflow, Cloud Composer, Oozie, Azure Data Factory, and Cloud Data Fusion (expertise in any 2 is required)
- Well-versed with Data services, integration, ingestion, ELT/ETL, Data Governance, Security, and Meta-driven Development.
- Expertise in RDBMS (relational database management system) – writing complex SQL logic, DB/Query optimization, Data Modelling, and managing high data volume for mission-critical applications.
- Strong grip on programming using Python and PySpark.
- Clear understanding of data best practices prevailing in the industry.
- Preference given to candidates with an Azure or GCP architect certification (either of the two suffices).
- Strong networking and data security experience.
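As a small illustration of the DB/query-optimization skill listed above, the sketch below shows how an index changes a query plan; the events table and index are invented for the example, and SQLite stands in for whatever RDBMS the role actually uses.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, ts TEXT, payload TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(i % 50, f"2024-01-{i % 28 + 1:02d}", "x") for i in range(1000)],
)

# Without an index, filtering on user_id forces a full table scan.
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM events WHERE user_id = 7").fetchall()

# Adding an index turns the same filter into an index search.
conn.execute("CREATE INDEX idx_events_user ON events (user_id)")
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM events WHERE user_id = 7").fetchall()
```

Reading query plans this way, then adding or reshaping indexes, is the day-to-day core of the "DB/query optimization" requirement.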
Awareness of the Following:
- Application development understanding (Full Stack)
- Experience with open-source tools like Kafka, Spark, Splunk, Superset, etc.
- Good understanding of the analytics platform landscape, including AI/ML
- Experience with a data visualization tool such as Power BI, Tableau, Qlik, or QuickSight
About Us
Gramener is a design-led data science company. We build custom Data & AI solutions that help solve complex business problems with actionable insights and compelling data stories. We partner with enterprise data and digital transformation teams to improve the data-driven decision-making culture across the organization. Our open standard low-code platform, Gramex, rapidly builds engaging Data & AI solutions across multiple business verticals and use cases. Our solutions and technology have been recognized by analysts such as Gartner and Forrester and have won several awards.
We Offer You:
- a chance to try new things & take risks.
- meaningful problems you'll be proud to solve.
- people you will be comfortable working with.
- transparent and innovative work environment.
To know more about us, visit the Gramener website and blog.
If you are interested, kindly share the details below.
Total Experience:
Relevant Experience:
Notice Period:
CTC:
ECTC:
Current Location:
- Build and lead multiple teams of top-notch engineers to own, drive, and deliver critical parts of our products.
- Work closely with Engineering Managers to develop the best technical design and approach for new product development.
- Set up the organization & processes to enable timely delivery of projects with high quality.
- Set up best practices for development and champion their adoption.
- Oversee the architecture and design of technically robust, flexible, and scalable solutions, and drive their continuous improvement.
- Show strong business and technical judgment that will accelerate time to market of releases, while incrementally moving our services towards the long-term vision.
- Be responsible for mentoring and developing front line managers and engineers.
Requirement:
- Bachelor's degree or higher in Computer Science or a related field from a premier institute
- 5+ years of work experience in software development, including 1–2 years in a leadership role.
- Deep understanding of enterprise-grade technologies.
- Experience with the Node.js, Java, or Python tech stack.
- Knowledge of Object-Oriented Design, data structures, algorithm design, and complexity analysis.
- Strong analytic and quantitative skills; ability to use hard data and metrics to back up assumptions, recommendations, and drive actions.

Experience with Xcode, Cocoa Touch, Swift, third-party frameworks or libraries, the iPhone SDK, and React JS.
- Proficient with Swift and Cocoa Touch.
- Good grasp of OOP design.
- Experience with iOS frameworks such as Core Data, Core Animation, and Core Bluetooth, and with CocoaPods.
- Experience with offline storage, threading, and performance tuning.
- Familiarity with REST APIs to connect iOS applications to back-end services.
- Understanding of Apple design principles and interface guidelines.
- Experience with performance and memory tuning using tools such as Instruments and Shark, depending on project needs.
- Familiarity with cloud messaging APIs and push notifications.
- Experience working with the Google Maps API.
- Must understand REST and JSON.









