
Data Architect

Posted by Sravanthi Alamuri
9 - 13 yrs
₹10L - ₹23L / yr
Hyderabad
Skills
Data Analytics
Data Warehouse (DWH)
Data Structures
Spark
Architecture
cube building
data lake
Hadoop
Java
A Data Architect who will lead a team of 5 members. Required skills: Spark, Scala, Hadoop.
Read more
Users love Cutshort
Read about what our users have to say about finding their next opportunity on Cutshort.

Subodh Popalwar

Software Engineer, Memorres
For 2 years, I had trouble finding a company with good work culture and a role that will help me grow in my career. Soon after I started using Cutshort, I had access to information about the work culture, compensation and what each company was clearly offering.
Companies hiring on Cutshort

About I Base IT

Founded: 2011
Type:
Size:
Stage: Raised funding
About
We are a custom software development company. We understand the challenges faced in implementing dream projects, and we know how to overcome the obstacles.
Read more
Connect with the team
Sravanthi Alamuri
Company social profiles
LinkedIn | Facebook

Similar jobs

a global business process management company
Agency job
via Jobdost by Saida Jabbar
Bengaluru (Bangalore)
3 - 8 yrs
₹14L - ₹20L / yr
Business Intelligence (BI)
PowerBI
Windows Azure
Git
SVN
+9 more

Power BI Developer (Azure Developer)

Job Description:

A senior visualization engineer with an understanding of Azure Data Factory & Databricks, to develop and deliver solutions that enable the delivery of information to audiences in support of key business processes.

Ensure code and design quality through the execution of test plans, and assist in the development of standards & guidelines, working closely with internal and external design, business and technical counterparts.

 

Desired Competencies:

  • Strong data visualization design concepts centered on the business user, and a knack for communicating insights visually.
  • Ability to produce any of the available charting methods with drill-down options and action-based reporting, including using the right graphs for the underlying data with company themes and objects.
  • Publishing reports & dashboards on a reporting server and providing role-based access to users.
  • Ability to create wireframes in any tool for communicating the reporting design.
  • Creation of ad-hoc reports & dashboards to visually communicate data hub metrics (metadata information) for top-management understanding.
  • Should be able to handle huge volumes of data from databases such as SQL Server, Synapse, Delta Lake or flat files and create high-performance dashboards.
  • Should be strong in Power BI development.
  • Expertise in two or more BI (visualization) tools for building reports and dashboards.
  • Understanding of Azure components such as Azure Data Factory, Data Lake Store, SQL Database and Azure Databricks.
  • Strong knowledge of SQL queries.
  • Must have worked in full life-cycle development, from functional design to deployment.
  • Intermediate understanding of how to format, process and transform data.
  • Should have working knowledge of Git and SVN.
  • Good experience in establishing connections with heterogeneous sources like Hadoop, Hive, Amazon, Azure, Salesforce, SAP, HANA, APIs, various databases, etc.
  • Basic understanding of data modelling and the ability to combine data from multiple sources to create integrated reports.
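A minimal sketch of the kind of SQL rollup that typically feeds such a dashboard, using Python's stdlib sqlite3 (the table and column names here are invented for illustration; on the job this would run against SQL Server or Synapse):

```python
import sqlite3

# Toy data: the table and columns are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("North", 100.0), ("North", 50.0), ("South", 75.0)])

# A typical dashboard rollup: revenue per region, largest first.
rows = conn.execute(
    "SELECT region, SUM(amount) AS revenue "
    "FROM sales GROUP BY region ORDER BY revenue DESC"
).fetchall()
print(rows)  # [('North', 150.0), ('South', 75.0)]
```

The same GROUP BY / ORDER BY pattern underlies most visual aggregations, whichever BI tool renders the result.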

 

Preferred Qualifications:

  • Bachelor's degree in Computer Science or Technology
  • Proven success in contributing to a team-oriented environment
Read more
Publicis Sapient
10 recruiters
Mohit Singh
Posted by Mohit Singh
Bengaluru (Bangalore), Gurugram, Pune, Hyderabad, Noida
4 - 10 yrs
Best in industry
PySpark
Data engineering
Big Data
Hadoop
Spark
+6 more

Publicis Sapient Overview:

As a Senior Associate L1 in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles in creating custom solutions or implementing package solutions, and you will independently drive design discussions to ensure the necessary health of the overall solution.

Job Summary:

As a Senior Associate L1 in Data Engineering, you will produce technical designs and implement components for data engineering solutions. You will utilize a deep understanding of data integration and big data design principles in creating custom solutions or implementing package solutions, and you will independently drive design discussions to ensure the necessary health of the overall solution.

The role requires a hands-on technologist with a strong programming background in Java, Scala or Python, experience in data ingestion, integration and wrangling, computation and analytics pipelines, and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP or Azure cloud platforms is preferable.


Role & Responsibilities:

Job Title: Senior Associate L1 – Data Engineering

Your role is focused on the design, development and delivery of solutions involving:

• Data Ingestion, Integration and Transformation

• Data Storage and Computation Frameworks, Performance Optimizations

• Analytics & Visualizations

• Infrastructure & Cloud Computing

• Data Management Platforms

• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time

• Build functionality for data analytics, search and aggregation


Experience Guidelines:

Mandatory Experience and Competencies:

# Competency

1. Overall 3.5+ years of IT experience with 1.5+ years in data-related technologies

2. Minimum 1.5 years of experience in Big Data technologies

3. Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow and other components required in building end-to-end data pipelines. Working knowledge of real-time data pipelines is an added advantage.

4. Strong experience in at least one of the programming languages Java, Scala or Python; Java preferable

5. Hands-on working knowledge of NoSQL and MPP data platforms like HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL DW, GCP BigQuery, etc.
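The end-to-end pipeline stages named above can be sketched in miniature. This is a toy, pure-Python illustration of ingestion, transformation and aggregation (real pipelines would use Spark, Kafka or Airflow; the record shape is invented):

```python
import json
from collections import defaultdict

def ingest():
    # Stand-in for reading raw events from HDFS, Kafka, etc.
    yield from ['{"user": "a", "clicks": "3"}',
                '{"user": "b", "clicks": "5"}',
                '{"user": "a", "clicks": "2"}']

def transform(raw_records):
    # Parse and clean: JSON text in, typed (user, clicks) pairs out.
    for rec in raw_records:
        parsed = json.loads(rec)
        yield parsed["user"], int(parsed["clicks"])

def aggregate(pairs):
    # Roll up clicks per user, like a reduce/groupBy stage.
    totals = defaultdict(int)
    for user, clicks in pairs:
        totals[user] += clicks
    return dict(totals)

totals = aggregate(transform(ingest()))
print(totals)  # {'a': 5, 'b': 5}
```

Each stage only consumes an iterator and yields another, which is the same composition idea that Spark expresses with RDD/DataFrame transformations.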


Preferred Experience and Knowledge (Good to Have):

# Competency

1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres), with hands-on experience

2. Knowledge of data governance processes (security, lineage, catalog) and tools like Collibra, Alation, etc.

3. Knowledge of distributed messaging frameworks like ActiveMQ / RabbitMQ / Solace, search & indexing, and microservices architectures

4. Performance tuning and optimization of data pipelines

5. CI/CD – infra provisioning on cloud, auto build & deployment pipelines, code quality

6. Working knowledge of data platform services on at least one cloud platform, IAM and data security

7. Cloud data specialty and other related Big Data technology certifications



Personal Attributes:

• Strong written and verbal communication skills

• Articulation skills

• Good team player

• Self-starter who requires minimal oversight

• Ability to prioritize and manage multiple tasks

• Process orientation and the ability to define and set up processes

Read more
Hyderabad
6 - 9 yrs
₹10L - ₹15L / yr
SQL
Databases
SQL Server Reporting Services (SSRS)
SQL Server Integration Services (SSIS)
SQL Server Analysis Services (SSAS)
+11 more

Designation: Senior - DBA

Experience: 6-9 years

CTC: INR 17-20 LPA

Night Allowance: INR 800/Night

Location: Hyderabad (Hybrid)

Notice Period: NA

Shift Timing : 6:30 pm to 3:30 am

Openings: 3

Roles and Responsibilities:

As a Senior Database Administrator, you will be responsible for the physical design, development, administration and optimization of properly engineered database systems to meet agreed business and technical requirements. The candidate will work as part of (but not limited to) the Onsite/Offsite DBA group.

• Administration and management of databases in Dev, Stage and Production environments
• Performance tuning of database schemas, stored procedures, etc.
• Providing technical input on the setup and configuration of database servers and the SAN disk subsystem on all database servers
• Troubleshooting and handling all database-related issues and tracking them through to resolution
• Proactive monitoring of databases from both a performance and a capacity-management perspective
• Performing database maintenance activities such as backup/recovery and rebuilding and reorganizing indexes
• Ensuring that all database releases are properly assessed and measured from a functionality and performance perspective
• Ensuring that all databases are up to date with the latest service packs, patches & security fixes
• Taking ownership and ensuring high-quality, timely delivery of projects on hand
• Collaborating with application/database developers, quality assurance and operations/support staff
• Helping manage large, high-transaction-rate SQL Server production environments
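The backup/recovery duty above can be sketched with SQLite's online-backup API from Python's stdlib. A SQL Server DBA would of course use native BACKUP DATABASE/RESTORE tooling or Azure Backup Services; this example only illustrates the concept of copying a live database and reading data back from the copy:

```python
import sqlite3

src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
src.execute("INSERT INTO accounts VALUES (1, 250.0)")
src.commit()

backup = sqlite3.connect(":memory:")   # stand-in for a backup file
src.backup(backup)                     # online copy while the source stays live

# "Recovery": read the row back from the backup copy.
restored = backup.execute("SELECT balance FROM accounts WHERE id = 1").fetchone()[0]
print(restored)  # 250.0
```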

Eligibility:

• Bachelor's/Master's degree (BE/BTech/MCA/MTech/MS)
• 6 - 8 years of solid experience in SQL Server 2016/2019 database administration and maintenance on Azure and AWS cloud
• Experience handling and managing large SQL Server databases (greater than 200 GB) in a real-time production environment
• Experience in troubleshooting and resolving database integrity issues, performance issues, blocking/deadlocking issues, connectivity issues, data replication issues, etc.
• Experience in configuring and troubleshooting SQL Server HA
• Ability to detect and troubleshoot CPU, memory, I/O, disk space and other resource contention issues
• Experience with database maintenance activities such as backup/recovery, capacity monitoring/management and Azure Backup Services
• Experience with HA/failover technologies such as clustering, SAN replication, log shipping & mirroring
• Experience collaborating with development teams on physical database design activities and performance tuning
• Experience in managing and making software deployments/changes in real-time production environments
• Ability to work on multiple projects at one time with minimal supervision and to ensure high-quality, timely delivery
• Knowledge of tools like SQL LiteSpeed, SQL Diagnostic Manager and AppDynamics
• Strong understanding of Data Warehousing concepts and SQL Server architecture
• Certified DBA, proficient in T-SQL and in storage technologies such as ASM, SAN, NAS, RAID and multipathing
• Strong analytical and problem-solving skills; proactive, independent, with a proven ability to work under tight targets and pressure
• Experience working in a highly regulated environment, such as financial services institutions
• Expertise in SSIS and SSRS

Skills:

SSIS

SSRS


Read more
Amazech Systems pvt Ltd
Priyanga Eswaramoorthy
Posted by Priyanga Eswaramoorthy
Remote only
5 - 9 yrs
₹2L - ₹12L / yr
Azure Data Factory, SQL, Python
Data Warehouse (DWH)
Informatica
ETL
Azure Data Factory, Azure Data Lake, Python, SQL

Location: Bangalore / Pune (Remote)

Employment type:  Full time

Permanent website: www.amazech.com  

 

 Qualifications:   

 

B.E./B.Tech/M.E./M.Tech in Computer Science, Information Technology, Electrical or Electronics, with a good academic background.

 

 

Experience and Required Skill Sets:  

 

·        Minimum 5 years of hands-on experience with Azure Data Lake, Azure Data Factory, SQL Data Warehouse, Azure Blob and Azure Storage Explorer

·        Experience in data warehouse/analytical systems using Azure Synapse

·        Proficient in creating Azure Data Factory pipelines for ETL processing: copy activity, custom Azure development, etc.

·        Knowledge of Azure Data Catalog, Event Grid, Service Bus, SQL, Purview and Synapse

·        Good technical knowledge of the Microsoft SQL Server BI suite (ETL, reporting, analytics, dashboards) using SSIS, SSAS, SSRS and Power BI

·        Design and develop batch and real-time streaming data loads to data warehouse systems
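As a rough, hedged illustration of the Azure Data Factory skills above: a minimal ADF pipeline with a single Copy activity looks approximately like the JSON below. The pipeline, activity and dataset names are invented, and the exact source/sink types depend on the connectors in use, so treat this as a sketch rather than a definitive ADF reference.

```json
{
  "name": "CopyBlobToSqlPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopyDailyExtract",
        "type": "Copy",
        "inputs":  [ { "referenceName": "BlobCsvDataset", "type": "DatasetReference" } ],
        "outputs": [ { "referenceName": "AzureSqlDataset", "type": "DatasetReference" } ],
        "typeProperties": {
          "source": { "type": "DelimitedTextSource" },
          "sink":   { "type": "AzureSqlSink" }
        }
      }
    ]
  }
}
```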

 

  Other Requirements: 

·        A Bachelor's or Master's degree (Engineering or computer related degree preferred) 

·        Strong understanding of Software Development Life Cycles including Agile/Scrum  

  Responsibilities:   

·        Ability to create complex, enterprise-transforming applications that meet and exceed client expectations.   

·        Responsible for the bottom line, with strong project management abilities and the ability to keep the team to timelines.

·        Should demonstrate strong client-interfacing capabilities, such as emails and calls with clients in the US/UK


Read more
Synechron
3 recruiters
Ranjini N
Posted by Ranjini N
Bengaluru (Bangalore), Hyderabad
6 - 10 yrs
₹2L - ₹15L / yr
ETL
Informatica
Data Warehouse (DWH)
skill iconPython
Shell Scripting
+2 more

Position: ETL Developer

Location: Mumbai

Exp.Level: 4+ Yrs

Required Skills:

* Strong scripting knowledge, such as Python and Shell

* Strong relational database skills, especially with DB2/Sybase

* Ability to create high-quality, optimized stored procedures and queries

* Strong with scripting languages such as Python and Unix / K-Shell

* Strong knowledge of relational database performance and tuning, such as proper use of indices, database statistics/reorgs and de-normalization concepts

* Familiarity with the lifecycle of a trade and the flows of data in an investment banking operation is a plus

* Experienced in the Agile development process

* Java knowledge is a big plus but not essential

* Experience in delivery of metrics/reporting in an enterprise environment (e.g. demonstrated experience in BI tools such as Business Objects or Tableau, and in report design & delivery) is a plus

* Experience with ETL processes and tools such as Informatica is a plus; real-time message processing experience is a big plus

* Good team player; integrity & ownership
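The "proper use of indices" point above can be demonstrated with SQLite's query planner from Python's stdlib. The table, column and index names are invented; DB2/Sybase expose the same idea through their own EXPLAIN tooling:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER, symbol TEXT, qty INTEGER)")

def plan(sql):
    # EXPLAIN QUERY PLAN reports whether the optimizer scans or uses an index.
    return " ".join(row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT qty FROM trades WHERE symbol = 'XYZ'"
print(plan(query))   # full table scan: no index exists yet

conn.execute("CREATE INDEX idx_trades_symbol ON trades(symbol)")
print(plan(query))   # now a SEARCH ... USING INDEX idx_trades_symbol
```

The second plan shows the optimizer switching from a full scan to an index search once a suitable index on the filtered column exists.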

Read more
Orboai
4 recruiters
Hardika Bhansali
Posted by Hardika Bhansali
Noida, Mumbai
1 - 3 yrs
₹6L - ₹15L / yr
TensorFlow
OpenCV
OCR
PyTorch
Keras
+10 more

Who Are We

 

A research-oriented company with expertise in computer vision and artificial intelligence at its core, Orbo is a comprehensive platform built around an AI-based visual enhancement stack. Companies can find a product suited to their needs, where deep-learning-powered technology automatically improves their imagery.

 

ORBO's solutions are helping the BFSI and beauty & personal care industries with digital transformation, and the e-commerce industry with image retouching, in multiple ways.

 

WHY US

  • Join a top AI company
  • Grow with your best companions
  • Continuous pursuit of excellence, equality, respect
  • Competitive compensation and benefits

You'll be a part of the core team and will be working directly with the founders in building and iterating upon the core products that make cameras intelligent and images more informative.

 

To learn more about how we work, please check out

https://www.orbo.ai/.

 

Description:

We are looking for a computer vision engineer to lead our team in developing a factory floor analytics SaaS product. This is a fast-paced role, and the person will get the opportunity to develop an industrial-grade solution from concept to deployment.

 

Responsibilities:

  • Research and develop computer vision solutions for industries (BFSI, beauty and personal care, e-commerce, defence, etc.)
  • Lead a team of ML engineers in developing an industrial AI product from scratch
  • Set up an end-to-end deep learning pipeline for data ingestion, preparation, model training, validation and deployment
  • Tune models to achieve high accuracy and minimum latency
  • Deploy developed computer vision models on edge devices, after optimization, to meet customer requirements

 

 

Requirements:

  • Bachelor’s degree
  • Understanding of the depth and breadth of computer vision and deep learning algorithms.
  • Experience in taking an AI product from scratch to commercial deployment.
  • Experience in image enhancement, object detection, image segmentation and image classification algorithms
  • Experience in deployment with OpenVINO, ONNX Runtime and TensorRT
  • Experience in deploying computer vision solutions on edge devices such as Intel Movidius and NVIDIA Jetson
  • Experience with machine/deep learning frameworks like TensorFlow and PyTorch.
  • Proficient understanding of code versioning tools, such as Git
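The vision work described above ultimately rests on 2-D convolution. This dependency-free toy shows the operation itself (real projects would use PyTorch or TensorFlow); the image and kernel values are invented for illustration:

```python
def conv2d(image, kernel):
    kh, kw = len(kernel), len(kernel[0])
    oh, ow = len(image) - kh + 1, len(image[0]) - kw + 1
    out = [[0] * ow for _ in range(oh)]
    for i in range(oh):
        for j in range(ow):
            # Dot product of the kernel with the image patch at (i, j).
            out[i][j] = sum(image[i + di][j + dj] * kernel[di][dj]
                            for di in range(kh) for dj in range(kw))
    return out

# A vertical-edge kernel applied to an image with a sharp left/right split:
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
kernel = [[-1, 1],
          [-1, 1]]
edges = conv2d(image, kernel)
print(edges)  # [[0, 2, 0], [0, 2, 0]] -- strongest response at the 0->1 edge
```

A CNN learns the kernel values instead of hard-coding them, but the sliding-window arithmetic is exactly this.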

Our perfect candidate is someone that:

  • is proactive and an independent problem solver
  • is a constant learner. We are a fast growing start-up. We want you to grow with us!
  • is a team player and good communicator

 

What We Offer:

  • You will have fun working with a fast-paced team on a product that can impact the business model of E-commerce and BFSI industries. As the team is small, you will easily be able to see a direct impact of what you build on our customers (Trust us - it is extremely fulfilling!)
  • You will be in charge of what you build and be an integral part of the product development process
  • Technical and financial growth!
Read more
Dataweave Pvt Ltd
32 recruiters
Megha M
Posted by Megha M
Bengaluru (Bangalore)
7 - 10 yrs
Best in industry
Delivery Management
skill iconData Analytics
Agile/Scrum
Project delivery

Job Description: Project Manager

Internal Role: Delivery Manager     

Type: Full time

Location: Bangalore  


About Dataweave:      


DataWeave provides “Competitive Intelligence as a Service” to Retailers and Consumer brands helping them optimize their offerings through effective pricing, assortment and promotional recommendations.

It's hard to tell what we love more, problems or solutions! Every day, we choose to address some of the hardest data problems that there are. We are in the business of making sense of messy public data on the Web. At serious scale!

Read more at http://dataweave.com/about/become-dataweaver (Become a DataWeaver)

 

Job Description:

  • Develop a detailed project plan (Ensuring resource availability and allocation) to track progress
  • Developing project scopes and objectives, involving all relevant stakeholders and ensuring technical feasibility
  • Ensure that all projects are delivered on-time, within scope and within budget
  • Coordinate internal resources and third parties/vendors for the flawless execution of projects
  • Use appropriate verification techniques to manage risks, changes in project scope, schedule and costs
  • Measure project performance using appropriate systems, tools and techniques
  • Report and escalate to management as needed
  • Establish and maintain relationships with Clients, internal stakeholders, third parties / vendors
  • Create and maintain comprehensive project documentation
  • Definition of service level agreements (SLAs) in relation to services, ensuring the SLAs are achieved and that data sanity, hygiene & client expectations are exceeded
  • Effectively monitor, control and support delivery, ensuring standard operating procedures and methodologies are followed
  • Create KPIs for monitoring and review and publish health stats on a recurring basis

 

 

Key Skills / Knowledge required:

  • Proven ability to switch context, manage multiple short projects
  • Relevant experience of 3-5 years and overall experience of 8+ yrs
  • Exposure to Analytics delivery process, ability to troubleshoot using insights within the data
  • Basic Database query skills, with knowledge of Web-crawling concepts is preferred
  • Excellent communication skills
  • Team management and ability to deliver while working with cross functional teams
  • Exposure to Agile Development (Scrum / Kanban) methodology is a plus

 

Read more
Syrencloud
3 recruiters
Samarth Patel
Posted by Samarth Patel
Hyderabad
3 - 7 yrs
₹5L - ₹8L / yr
Data Analytics
Data analyst
SQL
SAP
Our growing technology firm is looking for an experienced Data Analyst who can turn project requirements into custom-formatted data reports. The ideal candidate for this position can handle complete life-cycle data generation and outline critical information for each Project Manager. We also need someone who can analyze business procedures and recommend specific types of data that can be used to improve them.
Read more
SpringML
1 video
4 recruiters
Sai Raj Sampath
Posted by Sai Raj Sampath
Remote, Hyderabad
4 - 9 yrs
₹12L - ₹20L / yr
Big Data
Data engineering
TensorFlow
Apache Spark
Java
+2 more
REQUIRED SKILLS:

• Total of 4+ years of experience in development, architecting/designing and implementing Software solutions for enterprises.

• Must have strong programming experience in either Python or Java/J2EE.

• Minimum of 4+ years' experience working with various cloud platforms, preferably Google Cloud Platform.

• Experience in architecting and designing solutions leveraging Google Cloud products such as Cloud BigQuery, Cloud Dataflow, Cloud Pub/Sub, Cloud Bigtable and TensorFlow will be highly preferred.

• Presentation skills with a high degree of comfort speaking with management and developers

• The ability to work in a fast-paced work environment

• Excellent communication, listening, and influencing skills

RESPONSIBILITIES:

• Lead teams to implement and deliver software solutions for Enterprises by understanding their requirements.

• Communicate efficiently and document the Architectural/Design decisions to customer stakeholders/subject matter experts.

• Opportunity to learn new products quickly, rapidly comprehend new technical areas (technical and functional) and apply detailed and critical thinking to customer solutions.

• Implementing and optimizing cloud solutions for customers.

• Migration of Workloads from on-prem/other public clouds to Google Cloud Platform.

• Provide solutions to team members for complex scenarios.

• Promote good design and programming practices with various teams and subject matter experts.

• Ability to work on any product on the Google cloud platform.

• Must be hands-on and be able to write code as required.

• Ability to lead junior engineers and conduct code reviews



QUALIFICATION:

• Minimum B.Tech/B.E Engineering graduate
Read more
LatentView Analytics
Bengaluru (Bangalore), Chennai
9 - 14 yrs
₹9L - ₹14L / yr
Data Structures
Business Development
Data Analytics
Regression Testing
Machine Learning (ML)
+4 more
Required Skill Set:

  • 5+ years of hands-on experience in delivering results-driven analytics solutions with proven business value
  • Great consulting and quantitative skills, a detail-oriented approach, and proven expertise in developing solutions using SQL, R, Python or similar tools
  • A background in Statistics / Econometrics / Applied Math / Operations Research would be considered a plus
  • Exposure to working with globally dispersed teams based out of India or other offshore locations

Role Description / Responsibilities:

  • Be the face of LatentView in the client's organization and help define analytics-driven consulting solutions to business problems
  • Translate business problems into analytic solution requirements and work with the LatentView team to develop high-quality solutions
  • Communicate effectively with the client / offshore team to manage client expectations and ensure timeliness and quality of insights
  • Develop expertise in the client's business and help translate that into increasingly high-value advisory solutions for the client
  • Oversee project delivery to ensure the team meets quality, productivity and SLA objectives
  • Grow the account in terms of revenue and the size of the team

You should apply if you want to:

  • Change the world with math and models: at our core, we believe that analytics can drive business transformation and lasting competitive advantage. We work with a heavy mix of algorithms, analysis, large databases and ROI to positively transform many a client's business performance
  • Make a direct impact on business: your contribution to delivering results-driven solutions can potentially lead to millions of dollars of additional revenue or profit for our clients
  • Thrive in a fast-paced environment: you work in small teams, in an entrepreneurial environment, and in a meritorious culture that values speed, growth, diversity and contribution
  • Work with great people: our selection process ensures that we hire only the very best, while more than 50% of our analysts and 90% of our managers are alumni of prestigious global institutions
Read more
Why apply to jobs via Cutshort
Personalized job matches
Stop wasting time. Get matched with jobs that meet your skills, aspirations and preferences.
Verified hiring teams
See actual hiring teams, find common social connections or connect with them directly. No 3rd party agencies here.
Move faster with AI
We use AI to get you faster responses, recommendations and unmatched user experience.
21,01,133
Matches delivered
37,12,187
Network size
15,000
Companies hiring
Did not find a job you were looking for?
Search for relevant jobs from 10000+ companies such as Google, Amazon & Uber actively hiring on Cutshort.
Get to hear about interesting companies hiring right now
Follow Cutshort on LinkedIn