11+ Text Mining Jobs in Hyderabad

JOB TITLE - Product Development Engineer - Machine Learning
● Work Location: Hyderabad
● Full-time
Company Description
Phenom People is the leader in Talent Experience Marketing (TXM for short). We’re an early-stage startup on a mission to fundamentally transform how companies acquire talent. As a category creator, our goals are two-fold: to educate talent acquisition and HR leaders on the benefits of TXM and to help solve their recruiting pain points.
Job Responsibilities:
- Design and implement machine learning, information extraction, probabilistic matching algorithms and models
- Research and develop innovative, scalable and dynamic solutions to hard problems
- Work closely with Machine Learning Scientists (PhDs), ML engineers, data scientists and data engineers to address challenges head-on.
- Use the latest advances in NLP, data science and machine learning to enhance our products and create new experiences
- Scale the machine learning algorithms that power our platform to support our growing customer base and increasing data volume
- Be a valued contributor in shaping the future of our products and services
- You will be part of our Data Science & Algorithms team and collaborate with product management and other team members
- Be part of a fast-paced, fun-focused, agile team
Job Requirements:
- 4+ years of industry experience
- Ph.D./MS/B.Tech in computer science, information systems, or similar technical field
- Strong background in mathematics, statistics, and data analytics
- Solid coding and engineering skills; prior machine learning experience is preferred but not mandatory
- Proficient in Java, Python, and Scala
- Industry experience building and productionizing end-to-end systems
- Knowledge of information extraction and NLP algorithms, coupled with deep learning (a minimal matching sketch follows this list)
- Experience with data processing and storage frameworks such as Hadoop, Spark, and Kafka
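As a purely illustrative aside on the matching and NLP work described above, the following minimal Python sketch scores a candidate's text against job descriptions using TF-IDF and cosine similarity. The sample texts, the scikit-learn choice, and the scoring approach are assumptions for illustration only, not Phenom's actual algorithm.

```python
# Hypothetical illustration: match candidate text to job descriptions
# with TF-IDF vectors and cosine similarity (scikit-learn).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

jobs = [
    "Machine learning engineer with NLP and information extraction experience",
    "Data engineer experienced in Spark, Kafka and Hadoop pipelines",
]
candidate = "NLP engineer who builds information extraction models"

vectorizer = TfidfVectorizer(stop_words="english")
job_matrix = vectorizer.fit_transform(jobs)        # one row per job posting
candidate_vec = vectorizer.transform([candidate])  # project into the same space

scores = cosine_similarity(candidate_vec, job_matrix)[0]
best = scores.argmax()
print(f"Best match: job {best} (score {scores[best]:.2f})")
```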
Position Summary
We’re looking for a Machine Learning Engineer to join our team at Phenom. The points below describe what it takes to fulfil this role.
- Build accurate machine learning models, the core goal of a machine learning engineer (see the sketch after this list)
- Strong grounding in linear algebra, applied statistics, and probability
- Building Data Models
- Strong knowledge of NLP
- Good understanding of multithreaded and object-oriented software development
- Mathematics, mathematics, and more mathematics
- Collaborate with Data Engineers to prepare data models required for machine learning models
- Collaborate with other product team members to apply state-of-the-art AI methods, including dialogue systems, natural language processing, information retrieval, and recommendation systems
- Build large-scale software systems and work on numerical computation
- Use predictive analytics and data mining to solve complex problems and drive business decisions
- Design accurate end-to-end ML architectures, including data flows, algorithm scalability, and applicability
- Tackle situations where both the problem and the solution are unknown
- Solve analytical problems, and effectively communicate methodologies and results to customers
- Adept at translating business needs into technical requirements and translating data into actionable insights
- Work closely with internal stakeholders such as business teams, product managers, engineering teams, and customer success teams.
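To ground the "accurate machine learning models" point above, here is a minimal, hypothetical train-and-evaluate loop in Python with scikit-learn. The dataset and model choice are placeholders for illustration, not a prescription for this role.

```python
# Hypothetical illustration: train and evaluate a simple predictive model.
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

# Report accuracy on the held-out split, the basic measure of model quality.
print(f"Held-out accuracy: {accuracy_score(y_test, model.predict(X_test)):.2f}")
```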
Benefits
- Competitive salary for a startup
- Gain experience rapidly
- Work directly with the executive team
- Fast-paced work environment
About Phenom People
At PhenomPeople, we believe candidates (job seekers) are consumers. That’s why we’re bringing the e-commerce experience to the job search, with a view to converting candidates into applicants. The Intelligent Career Site™ platform delivers the most relevant and personalized job search yet: a career site optimized for mobile and desktop, designed to integrate with any ATS, with tailored content selection (Glassdoor reviews, YouTube videos, LinkedIn connections) based on candidate search habits, and an integrated real-time recruiting analytics dashboard.
Use Company career sites to reach candidates and encourage them to convert. The Intelligent Career Site™ offers a single platform to serve candidates a modern e-commerce experience from anywhere on the globe and on any device.
We track every visitor that comes to the Company career site. Through fingerprinting technology, candidates are tracked from the first visit and served jobs and content based on their location, click-stream, behavior on site, browser and device to give each visitor the most relevant experience.
Like consumers, candidates research companies and read reviews before they apply for a job. Through our understanding of the candidate journey, we are able to personalize their experience and deliver relevant content from sources such as corporate career sites, Glassdoor, YouTube and LinkedIn.
We give you clear visibility into the Company's candidate pipeline. By tracking up to 450 data points, we build profiles for every career site visitor based on their site visit behavior, social footprint and any other relevant data available on the open web.
Gain a better understanding of Company’s recruiting spending and where candidates convert or drop off from Company’s career site. The real-time analytics dashboard offers companies actionable insights on optimizing source spending and the candidate experience.
Kindly explore more about the company Phenom at https://www.phenom.com/
YouTube - https://www.youtube.com/c/PhenomPeople
LinkedIn - https://www.linkedin.com/company/phenompeople/
The Salesforce Business Analyst serves as the bridge between business needs and Salesforce solutions, ensuring that the platform aligns with organizational goals and drives efficiency. This role involves gathering and analyzing business requirements, optimizing Salesforce processes, and collaborating with stakeholders to enhance system functionality.
Essential Duties and Responsibilities:
User Management and Support
- Create, manage, and deactivate user accounts, roles, and permissions.
- Manage system permissions and access levels to protect sensitive information.
- Develop and maintain reports and dashboards as needed.
- Provide ongoing support to sales teams, including troubleshooting issues and answering user questions.
- Collaborate with IT to monitor system usage and user activity to ensure compliance and data integrity.
System Maintenance, Upgrades & Testing
- Monitor Salesforce releases and assess impact on the sales organization.
- Contribute to the Salesforce product roadmap by identifying sales-specific needs and improvements.
- Develop test plans and test cases to validate Salesforce changes and enhancements.
- Conduct user acceptance testing (UAT) and troubleshoot issues before deployment.
- Ensure smooth deployment of new Salesforce features and updates.
Training and Documentation
- Develop training and best practices for sales teams on Salesforce functionality.
- Collaborate with Sales Enablement on training needs for “Selling Smarter” sessions.
- Create and maintain process documentation, configurations and user guides.
- Serve as the primary contact for Salesforce-related questions.
Cross-Functional Collaboration
- Collaborate with Commercial Operations Teams to align system capabilities with business goals.
- Participate in cross-functional projects, including system integrations and new functionality releases.
Requirement Gathering and Analysis
- Document and analyze business requirements related to sales processes and Salesforce functionality.
- Translate business needs into detailed user stories, use cases, and functional requirements.
Publicis Sapient
Job Summary:
As a Senior Associate L2 in Data Engineering, you will translate client requirements into technical designs and implement components for data engineering solutions. You will apply a deep understanding of data integration and big data design principles in creating custom solutions or implementing packaged solutions, and independently drive design discussions to ensure the health of the overall solution.
The role requires a hands-on technologist with a strong programming background in Java, Scala, or Python; experience in data ingestion, integration, wrangling, computation, and analytics pipelines; and exposure to Hadoop ecosystem components. Hands-on knowledge of at least one of the AWS, GCP, or Azure cloud platforms is also required.
Role & Responsibilities:
Your role is focused on the design, development, and delivery of solutions involving:
• Data Integration, Processing & Governance
• Data Storage and Computation Frameworks, Performance Optimizations
• Analytics & Visualizations
• Infrastructure & Cloud Computing
• Data Management Platforms
• Implement scalable architectural models for data processing and storage
• Build functionality for data ingestion from multiple heterogeneous sources in batch & real-time mode (a minimal batch-ingestion sketch follows this list)
• Build functionality for data analytics, search and aggregation
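As a sketch of the batch side of the ingestion responsibility above, the following PySpark snippet reads two heterogeneous sources, wrangles them, and writes partitioned Parquet. All paths, column names, and the join key are invented for illustration.

```python
# Hypothetical illustration: batch ingestion and light wrangling with PySpark.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("batch-ingestion").getOrCreate()

# Ingest from two heterogeneous sources: CSV and JSON.
orders = spark.read.option("header", True).csv("/data/raw/orders.csv")
events = spark.read.json("/data/raw/events.json")

# Wrangle: cast types, deduplicate, and join on a shared key.
orders = (
    orders.withColumn("amount", F.col("amount").cast("double"))
    .dropDuplicates(["order_id"])
)
enriched = orders.join(events, on="order_id", how="left")

# Persist as partitioned Parquet for downstream analytics.
enriched.write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/orders")
```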
Experience Guidelines:
Mandatory Experience and Competencies:
1. Overall 5+ years of IT experience, with 3+ years in data-related technologies
2. Minimum 2.5 years of experience in Big Data technologies, and working exposure to related data services on at least one cloud platform (AWS / Azure / GCP)
3. Hands-on experience with the Hadoop stack – HDFS, Sqoop, Kafka, Pulsar, NiFi, Spark, Spark Streaming, Flink, Storm, Hive, Oozie, Airflow, and other components required to build end-to-end data pipelines (a minimal streaming sketch follows this list)
4. Strong experience in at least one of Java, Scala, or Python; Java preferred
5. Hands-on working knowledge of NoSQL and MPP data platforms such as HBase, MongoDB, Cassandra, AWS Redshift, Azure SQL Data Warehouse, GCP BigQuery, etc.
6. Working knowledge of data-platform-related services, IAM, and data security on at least one cloud platform
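For the Kafka / Spark Streaming items above, here is a minimal Spark Structured Streaming sketch that consumes a topic and persists it. The broker address, topic, and paths are placeholders, and the job assumes the spark-sql-kafka connector package is on the classpath.

```python
# Hypothetical illustration: consume a Kafka topic with Spark Structured Streaming.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-stream").getOrCreate()

stream = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "clickstream")                # placeholder topic
    .load()
)

# Kafka delivers key/value as binary; decode the payload before use.
decoded = stream.select(F.col("value").cast("string").alias("payload"))

query = (
    decoded.writeStream.format("parquet")
    .option("path", "/data/streams/clickstream")
    .option("checkpointLocation", "/data/checkpoints/clickstream")
    .start()
)
query.awaitTermination()
```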
Preferred Experience and Knowledge (Good to Have):
1. Good knowledge of traditional ETL tools (Informatica, Talend, etc.) and database technologies (Oracle, MySQL, SQL Server, Postgres), with hands-on experience
2. Knowledge of data governance processes (security, lineage, catalog) and tools like Collibra, Alation, etc.
3. Knowledge of distributed messaging frameworks like ActiveMQ / RabbitMQ / Solace, search & indexing, and microservices architectures
4. Performance tuning and optimization of data pipelines
5. CI/CD – infra provisioning on cloud, automated build & deployment pipelines, code quality
6. Cloud data specialty and other related Big Data technology certifications
Personal Attributes:
• Strong written and verbal communication skills
• Articulation skills
• Good team player
• Self-starter who requires minimal oversight
• Ability to prioritize and manage multiple tasks
• Process orientation and the ability to define and set up processes

We are looking for an enthusiastic developer with a strong understanding of core Ruby and the Rails framework, along with the PostgreSQL database; someone who is passionate about coding and loves working in a continuously challenging environment. You will be part of a talented software team, expected to deliver consistently in a fast-paced environment, and more than willing to build software that people love to use.
Key Responsibilities
The individual role that you’ll play in our team:
● Developing large multi-tenant applications in Rails.
● Understanding Rails best practices and religiously introducing them to our codebase.
● Knowing how to refactor effectively.
● Writing unit tests and following those practices religiously.
● Working closely with the Product managers and UX team.
● Helping QAs to write automated integration tests.
● Staying up-to-date with current and future backend technologies and architectures.
Read the ‘Skills and Experience’ section; it is not the usual yada yada, and you’ll be asked specific questions on these points.
Skills and Experience
● Ruby on Rails architecture best practices
● Knowledge of the latest versions of Ruby on Rails
● Strong OOP knowledge in Ruby.
● Asynchronous Networking in Ruby
● Designing RESTful HTTP APIs using JSON-Schema or JSON API (jsonapi.org).
● Ability to architect and develop an API-only backend
● Experience in using ActiveModel::Serializer
● Understanding OAuth2 or JWT (JSON Web Token) authentication mechanisms (a short token sketch follows this list).
● How to use RSpec
● Rails Security Best Practices
● PostgreSQL and Rails.
● SQL concepts like Joins, Relationships etc.
● Understanding DB Partition strategies.
● Knowledge about refactoring ActiveRecord models (read this - “7 Patterns to Refactor Fat ActiveRecord Models”).
● Understanding scaling strategies for high-traffic Rails applications (2 million+ requests a day).
● Background Job processing using Redis and Sidekiq
● Experience in using Amazon Web Services (AWS) tools.
● Writing automated Deployment Scripts using Capistrano, Ansible etc.
● Sending emails in Rails
● Knowledge of Linux and Git is mandatory
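Since JWT is language-agnostic, here is a short, hypothetical sketch of the issue-and-verify cycle mentioned above, written in Python with PyJWT for brevity rather than in Ruby; a Rails app would typically run the same HS256 flow with the ruby-jwt gem. The key and claims are placeholders.

```python
# Hypothetical illustration: the JWT issue/verify cycle with HS256.
import datetime

import jwt  # pip install PyJWT

SECRET = "change-me"  # placeholder signing key

# Issue a token carrying a subject claim and an expiry.
token = jwt.encode(
    {"sub": "user-42",
     "exp": datetime.datetime.utcnow() + datetime.timedelta(hours=1)},
    SECRET,
    algorithm="HS256",
)

# Verify the signature and expiry on every request before trusting the claims.
claims = jwt.decode(token, SECRET, algorithms=["HS256"])
print(claims["sub"])  # -> "user-42"
```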
Optional Skills
● Knowledge of Chef or Puppet
● Ability to do basic DevOps, like setting up a Linux server.
● WebSocket communication in Rails 5.
● Node.js
● JRuby

A US-based company developing electric autonomous tractors.


Job Description:
- Design, implement and deliver custom solutions using the existing robotics framework.
- Debug issues, do root-cause analysis and apply fixes.
- Design and implement tools to facilitate application development and testing.
- Participate in architectural improvements.
- Work with team members in deployment and field testing.
Requirements:
- Bachelor’s or Master’s degree in Engineering (ECE or CSE preferred)
- Work experience of 10+ years in software programming.
- Proficiency in Python programming for Linux based systems.
- Solid all-round understanding of software engineering.
- Basic knowledge of the Robot Operating System (ROS) is a plus (a minimal node sketch follows this list).
- Good understanding of algorithms and control loops.
- Working knowledge of Git: creating, merging branches, cherry-picking commits, examining the diff between two hashes. Advanced Git usage is a plus.
- Knowledge of video streaming from edge devices is a plus.
- Thrive in a fast-paced environment, with the ability to own a project’s tasks end-to-end with minimal hand-holding.
- Learn and adapt to new technologies and skills; work on projects independently with timely delivery and a defect-free approach.
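As a small, hypothetical taste of the Python-on-Linux and ROS items above, the following rospy node publishes a heartbeat message at 1 Hz. The node name, topic, and rate are invented for illustration and assume a ROS 1 environment with rospy installed.

```python
#!/usr/bin/env python
# Hypothetical illustration: a minimal ROS 1 node (rospy) that publishes
# a heartbeat, the kind of building block this role describes.
import rospy
from std_msgs.msg import String

def main():
    rospy.init_node("heartbeat_node")                             # placeholder node name
    pub = rospy.Publisher("/tractor/heartbeat", String, queue_size=10)
    rate = rospy.Rate(1)  # 1 Hz, control-loop style timing
    while not rospy.is_shutdown():
        pub.publish(String(data="alive"))
        rate.sleep()

if __name__ == "__main__":
    main()
```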


Position: Full Stack Developer
Location: Hyderabad, India
Must have skills:
- Experience working with Typescript
- Experience working with Vue (including Vuex for state management, Vue CLI, etc.)
- Knowledge of one or more CSS preprocessors / JavaScript bundlers, such as Sass, Less, Webpack, Parcel, or Rollup
- Experience integrating with RESTful or other web services
- Proficiency with Git
- Appreciation for clean and well documented code
- Thorough understanding of user experience and possibly even product strategy


Urgent Requirement for PHP CodeIgniter Developer
Experience: 3 to 5 years
Location: Hyderabad
Skills required: HTML, CSS, MySQL, core PHP, OOP, MVC, API integrations
Framework knowledge such as CodeIgniter and Yii
Knowledge of JavaScript and jQuery customization; basic knowledge of AngularJS
CMS knowledge such as WordPress and Joomla
Hands-on experience with the above skills is required.
