9+ Analytics Jobs in Hyderabad | Analytics Job openings in Hyderabad
Apply to 9+ Analytics Jobs in Hyderabad on CutShort.io. Explore the latest Analytics Job opportunities across top companies like Google, Amazon & Adobe.

Job Title: Digital Marketing Specialist – Email Marketing, SEO & LinkedIn Outreach
Experience: 4–6 Years
Location: [Insert Location / Remote / Hybrid]
Industry: IT Services | Custom Enterprise Applications | B2B Tech Marketing
Employment Type: Full-Time
About Us
We are a fast-growing IT services company focused on delivering custom enterprise application development, cloud-native platforms, and digital transformation solutions. We are targeting clients across SEA, India, Europe, GCC, and North America, and are building a global brand presence to support our expansion.
We are looking for a Digital Marketing Specialist with proven expertise in Email Marketing, SEO, and LinkedIn Outreach to support lead generation, online visibility, and global campaign execution.
Role Overview
As a Digital Marketing Specialist, you’ll lead email campaigns, SEO initiatives, and LinkedIn marketing efforts to engage enterprise decision-makers and generate qualified leads across key global regions. You'll collaborate closely with content, sales, and leadership teams to drive measurable marketing impact.
Key Responsibilities
Email Marketing
- Plan, create, and execute targeted email campaigns for lead generation, nurturing, and re-engagement across SEA, India, Europe, GCC, and North America.
- Develop personalized workflows based on buyer personas, industries, and intent signals.
- Continuously A/B test and optimize for open rates, CTR, and conversions.
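The A/B testing mentioned above is ultimately a statistics exercise. As a minimal illustrative sketch (not tied to any particular email platform; the function name and numbers are invented), a two-proportion z-test can indicate whether a difference in open rates between two subject lines is likely real rather than noise:

```python
from math import sqrt, erfc

def ab_open_rate_test(opens_a, sent_a, opens_b, sent_b):
    """Two-proportion z-test on open rates; returns (z, two-sided p-value)."""
    p_a, p_b = opens_a / sent_a, opens_b / sent_b
    pooled = (opens_a + opens_b) / (sent_a + sent_b)
    se = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided, normal approximation
    return z, p_value

# Variant B (260 opens / 1000 sent) vs. variant A (200 opens / 1000 sent)
z, p = ab_open_rate_test(200, 1000, 260, 1000)
```

With these example numbers the difference is significant at the 1% level; in practice, platforms like Mailchimp or HubSpot surface comparable test results directly in their campaign reports.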
Search Engine Optimization (SEO)
- Own and execute on-page and off-page SEO strategies to boost visibility across regional search engines.
- Conduct keyword research and content optimization for blog posts, landing pages, and service pages.
- Develop high-quality backlink strategies through guest posting, outreach, and digital PR.
- Monitor SEO performance using tools like Google Search Console, Ahrefs, SEMrush, or Moz.
LinkedIn Marketing & Outreach
- Design and manage LinkedIn outreach campaigns targeting enterprise decision-makers (CXOs, IT Heads, Digital Leaders).
- Build and manage prospect lists using LinkedIn Sales Navigator and engagement tools (e.g., Apollo, Lemlist, etc.).
- Collaborate with the sales team to coordinate messaging, follow-ups, and campaign timing.
- Manage company LinkedIn presence — schedule posts, engage with the network, and grow audience organically.
Analytics & Optimization
- Track and report campaign performance across all digital channels (email, LinkedIn, SEO).
- Generate insights and recommendations for continuous improvement.
- Maintain lead funnel hygiene and alignment with CRM (e.g., HubSpot, Zoho).
What You Bring
- 4–6 years of experience in B2B digital marketing, ideally within the IT services or enterprise tech space.
- Demonstrated success running international marketing campaigns across SEA, India, Europe, GCC, and North America.
- Hands-on expertise with email marketing tools (e.g., Mailchimp, HubSpot, Zoho Campaigns).
- Solid understanding of SEO and hands-on experience with SEO tools such as Google Search Console, Ahrefs, SEMrush, or Moz.
- Proficient in LinkedIn Marketing and Outreach, including Sales Navigator and campaign tools.
- Strong communication and copywriting skills tailored to B2B enterprise audiences.
- Self-driven, detail-oriented, and able to manage multiple priorities in a fast-paced startup environment.
Nice to Have
- Experience with performance marketing (Google/LinkedIn Ads).
- Familiarity with automation tools (e.g., Lemlist, Apollo, HubSpot).
- Knowledge of HTML/CSS for email customisation.
- Exposure to content marketing and blog strategy for lead generation.
Why Join Us?
- Play a key role in expanding a global tech services brand.
- Drive measurable impact across multiple digital channels and markets.
- Work in a collaborative, innovation-driven environment with high visibility.
- Fast-track growth opportunities in a high-performance culture.
- Competitive compensation + performance-based bonuses.
Looking for technically skilled candidates with excellent interpersonal skills for the technical support position. Technical support officers troubleshoot technical issues, provide timely customer feedback, and support the roll-out of new applications, among other duties.
Moreover, technical support officers need to talk to customers directly, as well as create written documentation, requiring excellent written and verbal communication.
Responsibilities:
- Identifying hardware and software solutions.
- Troubleshooting technical issues.
- Diagnosing and repairing faults.
- Resolving network issues.
- Installing and configuring hardware and software.
- Speaking to customers to quickly get to the root of their problem.
- Providing timely and accurate customer feedback.
- Talking customers through a series of actions to resolve a problem.
- Following up with clients to ensure the problem is resolved.
- Replacing or repairing the necessary parts.
- Supporting the roll-out of new applications.
- Providing support in the form of procedural documentation.
- Managing multiple cases at one time.
- Testing and evaluating new technologies.
- Conducting electrical safety checks on equipment.
Requirements:
- Degree in computer science or information technology.
- Certification in Microsoft, Linux, or Cisco is advantageous.
- Prior experience in tech support, desktop support, or a similar role.
- Proficiency in Windows/Linux/Mac OS.
- Experience with remote desktop applications and help desk software.
- Attention to detail and good problem-solving skills.
- Excellent interpersonal skills.
- Good written and verbal communication.
Seeking a detail-oriented Market Research Analyst to conduct surveys and analyze customer preferences and statistical data. Your role will involve providing valuable insights to support customers in their decision-making processes related to product designs, pricing, and promotions.
As a successful Market Research Analyst, you will have the ability to independently analyze qualitative data, identify trends, assess strategies, and evaluate competition, with the ultimate goal of enhancing competitiveness. Your responsibilities will include gathering and interpreting market research data, generating reports, and presenting findings to stakeholders.
We are looking for a self-motivated professional with strong analytical skills and a deep understanding of market dynamics. Join our team and contribute to our company's success by helping us make data-driven decisions to optimize our products and strategies.
- 5+ years of industry experience in administering (including setting up, managing, and monitoring) streaming and batch data processing pipelines using frameworks such as Kafka Streams and PySpark, and streaming databases like Druid or equivalents such as Hive
- Strong industry expertise with containerization technologies, including Kubernetes (EKS/AKS) and Kubeflow
- Experience with cloud platform services such as AWS, Azure, or GCP, especially EKS and Managed Kafka
- 5+ years of industry experience in Python
- Experience with popular modern web frameworks such as Spring Boot, Play Framework, or Django
- Experience with scripting languages (Python highly desirable) and with API development using Swagger
- Experience implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Familiarity with continuous integration tools such as Jenkins
Responsibilities
- Architect, design, and implement large-scale data processing pipelines using Kafka Streams, PySpark, Fluentd, and Druid
- Create custom Operators for Kubernetes and Kubeflow
- Develop data ingestion processes and ETLs
- Assist in DevOps operations
- Design and implement APIs
- Identify performance bottlenecks and bugs, and devise solutions to these problems
- Help maintain code quality, organization, and documentation
- Communicate with stakeholders regarding various aspects of the solution
- Mentor team members on best practices
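To make the streaming-pipeline bullets above concrete: the core of a windowed stream aggregation (the kind of computation Kafka Streams or PySpark Structured Streaming performs) can be sketched in plain Python, without assuming any broker or cluster is available. Event names and window size here are invented for illustration:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_ms):
    """Count events per (window_start, key), as a windowed stream aggregation would.

    events: iterable of (timestamp_ms, key) pairs.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_ms)  # assign each event to its tumbling window
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(10, "click"), (900, "click"), (1100, "view"), (1900, "click")]
result = tumbling_window_counts(events, window_ms=1000)
# window [0, 1000) holds two clicks; window [1000, 2000) holds one view and one click
```

A real pipeline adds what this sketch omits: consuming from Kafka topics, handling late and out-of-order events with watermarks, and emitting results to a sink such as Druid.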
- 3+ years of industry experience in administering (including setting up, managing, and monitoring) streaming and batch data processing pipelines using frameworks such as Kafka, the ELK Stack, and Fluentd, and streaming databases like Druid
- Strong industry expertise with containerization technologies, including Kubernetes and docker-compose
- 2+ years of industry experience developing scalable data ingestion processes and ETLs
- Experience with cloud platform services such as AWS, Azure, or GCP, especially EKS and Managed Kafka
- Experience with scripting languages; Python experience highly desirable
- 2+ years of industry experience in Python
- Experience with popular modern web frameworks such as Spring Boot, Play Framework, or Django
- Demonstrated expertise in building cloud-native applications
- Experience in administering (including setting up, managing, and monitoring) streaming and batch data processing pipelines using frameworks such as Kafka, the ELK Stack, and Fluentd
- Experience in API development using Swagger
- Strong expertise with containerization technologies, including Kubernetes and docker-compose
- Experience with cloud platform services such as AWS, Azure, or GCP
- Experience implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Familiarity with continuous integration tools such as Jenkins
- Design and implement large-scale data processing pipelines using Kafka, Fluentd, and Druid
- Develop data ingestion processes and ETLs
- Assist in DevOps operations
- Design and implement APIs
- Identify performance bottlenecks and bugs, and devise solutions to these problems
- Help maintain code quality, organization, and documentation
- Communicate with stakeholders regarding various aspects of the solution
- Mentor team members on best practices
Job Description
We are looking for an experienced engineer with superb technical skills. You will primarily be responsible for architecting and building large-scale data pipelines that deliver AI and analytical solutions to our customers. The right candidate will enthusiastically take ownership of developing and managing continuously improving, robust, and scalable software solutions.
Although your primary responsibilities will be around back-end work, we prize individuals who are willing to step in and contribute to other areas, including automation, tooling, and management applications. Experience with, or a desire to learn, Machine Learning is a plus.
Skills
- Bachelor's/Master's/PhD in CS, or equivalent industry experience
- Demonstrated expertise in building and shipping cloud-native applications
- 5+ years of industry experience in administering (including setting up, managing, and monitoring) streaming and batch data processing pipelines using frameworks such as Kafka Streams and PySpark, and streaming databases like Druid or equivalents such as Hive
- Strong industry expertise with containerization technologies, including Kubernetes (EKS/AKS) and Kubeflow
- Experience with cloud platform services such as AWS, Azure, or GCP, especially EKS and Managed Kafka
- 5+ years of industry experience in Python
- Experience with popular modern web frameworks such as Spring Boot, Play Framework, or Django
- Experience with scripting languages (Python highly desirable) and with API development using Swagger
- Experience implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Familiarity with continuous integration tools such as Jenkins
Responsibilities
- Architect, design, and implement large-scale data processing pipelines using Kafka Streams, PySpark, Fluentd, and Druid
- Create custom Operators for Kubernetes and Kubeflow
- Develop data ingestion processes and ETLs
- Assist in DevOps operations
- Design and implement APIs
- Identify performance bottlenecks and bugs, and devise solutions to these problems
- Help maintain code quality, organization, and documentation
- Communicate with stakeholders regarding various aspects of the solution
- Mentor team members on best practices
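"Develop data ingestion processes and ETLs" in miniature: the extract-transform-load pattern can be sketched with nothing but the standard library, ingesting CSV text, normalizing it, and loading it into SQLite as a stand-in for whatever warehouse a real pipeline targets. Table and column names here are invented for illustration:

```python
import csv
import io
import sqlite3

RAW_CSV = """user_id,country,amount
1, IN ,100
2,us,250
1,IN,50
"""

def etl(raw_csv, conn):
    # Extract: parse the raw feed.
    rows = csv.DictReader(io.StringIO(raw_csv))
    # Transform: trim whitespace, uppercase country codes, cast amounts to integers.
    clean = [(int(r["user_id"]), r["country"].strip().upper(), int(r["amount"]))
             for r in rows]
    # Load: write the cleaned rows into the target table.
    conn.execute("CREATE TABLE IF NOT EXISTS orders (user_id INT, country TEXT, amount INT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", clean)
    conn.commit()

conn = sqlite3.connect(":memory:")
etl(RAW_CSV, conn)
total_by_country = dict(conn.execute(
    "SELECT country, SUM(amount) FROM orders GROUP BY country"))
```

Production ETLs add incremental loading, schema validation, and retries on top of this skeleton, but the extract/transform/load separation stays the same.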
We at Datametica Solutions Private Limited are looking for SQL Engineers who have a passion for the cloud, with knowledge of on-premise and cloud data implementations in the field of Big Data and Analytics, including but not limited to Teradata, Netezza, Exadata, Oracle, Cloudera, and Hortonworks.
Ideal candidates should have technical experience in migrations and the ability to help customers get value from Datametica's tools and accelerators.
Job Description
Experience : 4-10 years
Location : Pune
Mandatory Skills -
- Strong in ETL/SQL development
- Strong Data Warehousing skills
- Hands-on experience working with Unix/Linux
- Development experience in Enterprise Data warehouse projects
- Good to have: experience with Python and shell scripting
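The ETL/SQL and data-warehousing skills listed above often reduce to patterns like a staging-to-dimension upsert that deduplicates on a business key. A minimal sketch using SQLite (3.24+ for `ON CONFLICT`) as a stand-in warehouse; table names, columns, and data are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE staging (customer_id INT, email TEXT, updated_at TEXT);
CREATE TABLE dim_customer (customer_id INT PRIMARY KEY, email TEXT, updated_at TEXT);

INSERT INTO staging VALUES
  (1, 'a@old.com', '2024-01-01'),
  (1, 'a@new.com', '2024-06-01'),
  (2, 'b@x.com',   '2024-03-01');
""")

def load_dim(conn):
    # Keep only the latest staging row per business key, then upsert into the dimension.
    conn.execute("""
        INSERT INTO dim_customer (customer_id, email, updated_at)
        SELECT customer_id, email, updated_at
        FROM staging AS s
        WHERE updated_at = (SELECT MAX(updated_at) FROM staging
                            WHERE customer_id = s.customer_id)
        ON CONFLICT(customer_id) DO UPDATE SET
          email = excluded.email,
          updated_at = excluded.updated_at
    """)
    conn.commit()

load_dim(conn)
load_dim(conn)  # re-running is idempotent: conflicts update in place
rows = dict(conn.execute("SELECT customer_id, email FROM dim_customer"))
```

On platforms like Teradata or BigQuery the same logic is typically expressed with `MERGE`, but the dedupe-then-upsert shape carries over.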
Opportunities -
- Selected candidates will be provided training opportunities on one or more of the following: Google Cloud, AWS, DevOps Tools, Big Data technologies like Hadoop, Pig, Hive, Spark, Sqoop, Flume and Kafka
- Would get the chance to be part of enterprise-grade implementations of Cloud and Big Data systems
- Will play an active role in setting up the Modern data platform based on Cloud and Big Data
- Would be part of teams with rich experience in various aspects of distributed systems and computing
About Us!
A global leader in Data Warehouse Migration and Modernization to the Cloud, we empower businesses by migrating their data, workloads, ETL, and analytics to the cloud, leveraging automation.
We have expertise in transforming legacy platforms such as Teradata, Oracle, Hadoop, Netezza, Vertica, and Greenplum, along with ETL tools like Informatica, DataStage, Ab Initio, and others, to cloud-based data warehousing, with further capabilities in data engineering, advanced analytics solutions, data management, data lakes, and cloud optimization.
Datametica is a key partner of the major cloud service providers - Google, Microsoft, Amazon, Snowflake.
We have our own products!
Eagle – Data warehouse Assessment & Migration Planning Product
Raven – Automated Workload Conversion Product
Pelican - Automated Data Validation Product, which helps automate and accelerate data migration to the cloud.
Why join us!
Datametica is a place to innovate, bring new ideas to life, and learn new things. We believe in building a culture of innovation, growth, and belonging. Our people and their dedication over the years are the key factors in achieving our success.
Benefits we Provide!
Working with highly technical, passionate, mission-driven people
Subsidized Meals & Snacks
Flexible Schedule
Approachable leadership
Access to various learning tools and programs
Pet Friendly
Certification Reimbursement Policy
Check out more about us on our website below!
www.datametica.com
We are looking for an experienced engineer with superb technical skills. You will primarily be responsible for architecting and building large-scale data pipelines that deliver AI and analytical solutions to our customers. The right candidate will enthusiastically take ownership of developing and managing continuously improving, robust, and scalable software solutions. The successful candidate will be curious, creative, ambitious, self-motivated, flexible, and have a bias towards taking action. As part of the early engineering team, you will have a chance to make a measurable impact on the future of Thinkdeeply, while taking on a significant amount of responsibility.
Although your primary responsibilities will be around back-end work, we prize individuals who are willing to step in and contribute to other areas, including automation, tooling, and management applications. Experience with, or a desire to learn, Machine Learning is a plus.
Experience
12+ Years
Location
Hyderabad
Skills
Bachelor's/Master's/PhD in CS, or equivalent industry experience
10+ years of industry experience in Java-related frameworks such as Spring and/or Typesafe
Experience with scripting languages; Python highly desirable, with 5+ years of industry experience in Python
Experience with popular modern web frameworks such as Spring Boot, Play Framework, or Django
Demonstrated expertise in building and shipping cloud-native applications
Experience in administering (including setting up, managing, and monitoring) streaming and batch data processing pipelines using frameworks such as Kafka, the ELK Stack, and Fluentd
Experience in API development using Swagger
Strong expertise with containerization technologies, including Kubernetes and docker-compose
Experience with cloud platform services such as AWS, Azure, or GCP
Experience implementing automated testing platforms and unit tests
Proficient understanding of code versioning tools, such as Git
Familiarity with continuous integration tools such as Jenkins
Responsibilities
Architect, design, and implement large-scale data processing pipelines
Design and implement APIs
Assist in DevOps operations
Identify performance bottlenecks and bugs, and devise solutions to these problems
Help maintain code quality, organization, and documentation
Communicate with stakeholders regarding various aspects of the solution
Mentor team members on best practices
Top MNC looking for candidates in Business Analytics (4-8 years of experience).
Requirement :
- Experience in metric development and business analytics
- High proficiency in data and statistical skills
- Tools: R, SQL, Python, Advanced Excel
- Good verbal and written communication skills
- Supply chain domain knowledge
*Job Summary*
Duration: 6-month contract, based in Hyderabad
Availability: 1 week/Immediate
Qualification: Graduate/PG from Reputed University
*Key Skills*
R, SQL, Advanced Excel, Python
*Required Experience and Qualifications*
5 to 8 years of Business Analytics experience.




