About Us:
PluginLive is an all-in-one tech platform that bridges the gap between all its stakeholders: Corporates, Institutes, Students, and Assessment & Training Partners. This ecosystem helps Corporates build and position their brand with colleges and the student community to scale their human capital, while increasing student placements for Institutes and giving Students a real-time perspective of the corporate world to help them upskill into more desirable candidates.
Role Overview:
Entry-level Data Engineer position focused on building and maintaining data pipelines while developing visualization skills. You'll work alongside senior engineers to support our data infrastructure and create meaningful insights through data visualization.
Responsibilities:
- Assist in building and maintaining ETL/ELT pipelines for data processing
- Write SQL queries to extract and analyze data from various sources
- Support data quality checks and basic data validation processes
- Create simple dashboards and reports using visualization tools
- Learn and work with Oracle Cloud services under guidance
- Use Python for basic data manipulation and cleaning tasks
- Document data processes and maintain data dictionaries
- Collaborate with team members to understand data requirements
- Participate in troubleshooting data issues with senior support
- Contribute to data migration tasks as needed
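The SQL-plus-Python workflow the responsibilities above describe can be sketched with Python's built-in sqlite3 module; the table, columns, and values here are hypothetical, not from any actual PluginLive pipeline:

```python
import sqlite3

# Hypothetical source table: extract rows with SQL, then clean them in Python.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("south", 120.0), ("SOUTH ", 80.0), (None, 40.0)],
)

# Extraction: a plain SQL query against the source.
rows = conn.execute("SELECT region, amount FROM sales").fetchall()

# Cleaning: normalise casing/whitespace and drop rows missing a region.
cleaned = [
    (region.strip().lower(), amount)
    for region, amount in rows
    if region is not None
]
print(cleaned)  # [('south', 120.0), ('south', 80.0)]
```

The same extract-then-validate shape scales up to the ETL/ELT tools named in the posting; only the source connection and cleaning rules change.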
Qualifications:
Required:
- Bachelor's degree in Computer Science, Information Systems, or related field
- Around 2 years of experience in data engineering or a related field
- Strong SQL knowledge and database concepts
- Comfortable with Python programming
- Understanding of data structures and ETL concepts
- Problem-solving mindset and attention to detail
- Good communication skills
- Willingness to learn cloud technologies
Preferred:
- Exposure to Oracle Cloud or any cloud platform (AWS/GCP)
- Basic knowledge of data visualization tools (Tableau, Power BI, or Python libraries like Matplotlib)
- Experience with Pandas for data manipulation
- Understanding of data warehousing concepts
- Familiarity with version control (Git)
- Academic projects or internships involving data processing
Nice-to-Have:
- Knowledge of dbt, BigQuery, or Snowflake
- Exposure to big data concepts
- Experience with Jupyter notebooks
- Comfort with AI-assisted coding tools (Copilot, GPTs)
- Personal projects showcasing data work
What We Offer:
- Mentorship from senior data engineers
- Hands-on learning with modern data stack
- Access to paid AI tools and learning resources
- Clear growth path to mid-level engineer
- Direct impact on product and data strategy
- No unnecessary meetings — focused execution
- Strong engineering culture with continuous learning opportunities
The Impact you will create on the Job
- Understand and handle deployment and troubleshooting of Compute, Networking, Storage, and Database services on AWS & Azure.
- IT experience in a team handling cloud infrastructure and Linux/Windows operating systems.
- Work on different services in the AWS & Azure cloud platforms.
- In-depth knowledge of a wide range of AWS & Azure services across Compute, Storage, Networking, Infrastructure as Code, serverless computing, IAM, and CI/CD pipelines.
- Possess a thorough understanding of Internet-based technologies (DNS, security, IP routing, SSH, FTP, HTTP/HTTPS, etc.).
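The IP-routing fundamentals listed above can be illustrated with Python's standard ipaddress module; the address ranges below are made-up example values, not any real network:

```python
import ipaddress

# Hypothetical address ranges, e.g. a VPC/VNet and one subnet inside it.
vpc = ipaddress.ip_network("10.0.0.0/16")
app_subnet = ipaddress.ip_network("10.0.1.0/24")

# A subnet's range must be contained in its parent network.
assert app_subnet.subnet_of(vpc)

# Longest-prefix match: of all networks containing a host,
# the most specific (largest prefix length) wins the route.
host = ipaddress.ip_address("10.0.1.25")
candidates = [n for n in (vpc, app_subnet) if host in n]
best = max(candidates, key=lambda n: n.prefixlen)
print(best)  # 10.0.1.0/24
```

This is the same longest-prefix-match rule that AWS and Azure route tables apply when more than one route covers a destination.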
· Design, develop, and implement AI/ML models and algorithms.
· Focus on building Proof of Concept (POC) applications to demonstrate the feasibility and value of AI solutions.
· Write clean, efficient, and well-documented code.
· Collaborate with data engineers to ensure data quality and availability for model training and evaluation.
· Work closely with senior team members to understand project requirements and contribute to technical solutions.
· Troubleshoot and debug AI/ML models and applications.
· Stay up-to-date with the latest advancements in AI/ML.
· Utilize machine learning frameworks (e.g., TensorFlow, PyTorch, Scikit-learn) to develop and deploy models.
· Develop and deploy AI solutions on Google Cloud Platform (GCP).
· Implement data preprocessing and feature engineering techniques using libraries like Pandas and NumPy.
· Utilize Vertex AI for model training, deployment, and management.
· Integrate and leverage Google Gemini for specific AI functionalities.
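The model-building loop at the heart of the POC work described above can be sketched without any framework; this is a deliberately minimal, framework-free example on toy data (in practice TensorFlow, PyTorch, or Scikit-learn would handle this), fitting y = w*x + b by batch gradient descent:

```python
# Toy (x, y) pairs roughly following y = 2x + 1 with noise.
data = [(1.0, 3.1), (2.0, 4.9), (3.0, 7.2), (4.0, 8.8)]

w, b, lr = 0.0, 0.0, 0.01
for _ in range(5000):
    # Gradients of mean squared error with respect to w and b.
    grad_w = sum(2 * (w * x + b - y) * x for x, y in data) / len(data)
    grad_b = sum(2 * (w * x + b - y) for x, y in data) / len(data)
    w -= lr * grad_w
    b -= lr * grad_b

print(round(w, 2), round(b, 2))  # converges near 1.94 and 1.15, the least-squares fit
```

The frameworks named in the posting automate exactly these pieces: automatic gradients, optimizers, and batching, on top of the same iterative loop.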
Qualifications:
· Bachelor’s degree in Computer Science, Artificial Intelligence, or a related field.
· 3+ years of experience in developing and implementing AI/ML models.
· Strong programming skills in Python.
· Experience with machine learning frameworks such as TensorFlow, PyTorch, or Scikit-learn.
· Good understanding of machine learning concepts and techniques.
· Ability to work independently and as part of a team.
· Strong problem-solving skills.
· Good communication skills.
· Experience with Google Cloud Platform (GCP) is preferred.
· Familiarity with Vertex AI is a plus.
Years of Experience required: 4 years (Min)
Mandatory:
· At least 3 years of experience in database or BI developer role
· Microsoft accreditation(s) preferred
· Proficient at design and development of ETL using MS SQL Integration Services (SSIS)
· Experience with common ETL Design Patterns
· Proficient at writing T-SQL, stored procedures and functions
· Proficient with analysis of data quality and data profiling
· Experience with writing technical documentation and communicating database design
· Experience working with an Agile team to create and develop operational processes and software
· Experience developing ETL of OLTP data for analytical/BI reporting; ability to perform as OLTP DBA is a plus
· Proficient in Excel BI Tools, XML, JSON and Version Control Systems (TFS experience is a plus)
· Experience in OLAP schema modeling for DWH
· Tableau desktop and Server experience is a plus.
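The "data quality and data profiling" work named above boils down to computing per-column statistics such as null rates and distinct counts. A minimal stdlib sketch, with hypothetical rows and column names (in the role itself this would typically be done in T-SQL or SSIS):

```python
from collections import Counter

# Hypothetical rows as they might come out of an OLTP extract.
rows = [
    {"customer_id": 1, "city": "Chennai"},
    {"customer_id": 2, "city": None},
    {"customer_id": 2, "city": "Chennai"},
    {"customer_id": 3, "city": "Madurai"},
]

def profile(rows, column):
    """Basic profile for one column: null rate, distinct count, top value."""
    values = [r[column] for r in rows]
    non_null = [v for v in values if v is not None]
    return {
        "null_rate": 1 - len(non_null) / len(values),
        "distinct": len(set(non_null)),
        "top": Counter(non_null).most_common(1),
    }

print(profile(rows, "city"))
# {'null_rate': 0.25, 'distinct': 2, 'top': [('Chennai', 2)]}
```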
Key Responsibilities:
- Make outbound calls to customers and explain products/services in Tamil
- Handle inbound customer queries politely and professionally
- Maintain call logs, update customer records, and follow up regularly
- Achieve daily/weekly targets for call volume and lead conversion
- Provide accurate information and resolve customer concerns
- Report call status and feedback to the team lead/manager
Requirements:
- Good communication skills in Tamil (additional languages a plus)
- Basic computer knowledge and typing skills
- Patience and customer-friendly attitude
- Previous experience in telecalling or customer service is an advantage
- Minimum qualification: 10th/12th/Any Degree
Role Description
Eligibility:
1. An ex-serviceman preferably with civil work experience in security/training.
2. Age- preferably
3. Preferably from Infantry.
4. If a civilian: minimum 5 years' work experience as a training manager in the field of security.
Qualifications
- Training & Development and Employee Training skills
- Excellent Communication skills, both oral and written
- Strong Analytical Skills
- Experience in creating and delivering successful training programs
- Demonstrated ability to work well within a team as well as with individuals across all levels of the organization
- Keen attention to detail and demonstrated ability to prioritize and manage multiple tasks effectively
- Bachelor's degree in a relevant field such as Human Resources, Education, Organizational Behavior, or related field
- Additional training or certification in training and development is a plus
- Prior work experience in the security industry is preferred
Qualification: B.Com / BBA / MBA / 1-2 years of experience in sales and marketing
Skills: Excellent communication and writing skills in both English & Tamil, a spontaneous mindset, broad knowledge of sales & marketing, social skills and technical knowledge, and anger & stress management skills.
Salary: As per industry standard
Experience: 1-2 years or more / Freshers who have completed an internship or training program in business development or sales & marketing
Title: Software Development Engineer Fullstack
Duration: Permanent / Fulltime
- Development and potentially maintenance of advanced software systems and their features.
- Interact with customers and partners to scope and estimate user stories from acceptance criteria.
- Produce highly testable code with an emphasis on quality and correctness.
- Work alongside other engineers to improve technology, while consistently applying established effective software development practices.
- Attend regular project meetings, report on progress and setbacks, and participate in both planning and retrospective exercises.
- Proficient in .NET Core/C#; strong in OOP, MVC, design patterns, SOLID principles, Web API, and SQL Server.
- Proficient in HTML5, CSS3, JavaScript/TypeScript, and web development frameworks.
- Expertise in React and Node.js.
- 3 years of SQL Server development.
- Database design; SOAP and REST services.
We are looking for an outstanding Big Data Engineer with experience setting up and maintaining Data Warehouses and Data Lakes for an organization. This role will closely collaborate with the Data Science team and assist them in building and deploying machine learning and deep learning models on big data analytics platforms.
Roles and Responsibilities:
- Develop and maintain scalable data pipelines and build out new integrations and processes required for optimal extraction, transformation, and loading of data from a wide variety of data sources using 'Big Data' technologies.
- Develop programs in Scala and Python as part of data cleaning and processing.
- Assemble large, complex data sets that meet functional/non-functional business requirements and foster data-driven decision making across the organization.
- Design and develop distributed, high-volume, high-velocity, multi-threaded event processing systems.
- Implement processes and systems to validate data and monitor data quality, ensuring production data is always accurate and available for the key stakeholders and business processes that depend on it.
- Perform root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Provide high operational excellence guaranteeing high availability and platform stability.
- Closely collaborate with the Data Science team and assist them in building and deploying machine learning and deep learning models on big data analytics platforms.
Skills:
- Experience with Big Data pipeline, Big Data analytics, Data warehousing.
- Experience with SQL/No-SQL, schema design and dimensional data modeling.
- Strong understanding of Hadoop architecture and the HDFS ecosystem, and experience with the Big Data technology stack, such as HBase, Hadoop, Hive, and MapReduce.
- Experience in designing systems that process structured as well as unstructured data at large scale.
- Experience in AWS/Spark/Java/Scala/Python development.
- Strong skills in PySpark (Python & Spark): the ability to create, manage, and manipulate Spark DataFrames, and expertise in Spark query tuning and performance optimization.
- Experience in developing efficient software code/frameworks for multiple use cases leveraging Python and big data technologies.
- Prior exposure to streaming data sources such as Kafka.
- Knowledge of shell scripting and Python scripting.
- High proficiency in database skills (e.g., Complex SQL), for data preparation, cleaning, and data wrangling/munging, with the ability to write advanced queries and create stored procedures.
- Experience with NoSQL databases such as Cassandra / MongoDB.
- Solid experience in all phases of Software Development Lifecycle - plan, design, develop, test, release, maintain and support, decommission.
- Experience with DevOps tools (GitHub, Travis CI, and JIRA) and methodologies (Lean, Agile, Scrum, Test Driven Development).
- Experience building and deploying applications on both on-premises and cloud-based infrastructure.
- Having a good understanding of machine learning landscape and concepts.
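The MapReduce pattern named in the skills above can be sketched in pure Python with the canonical word-count example; in Hadoop or Spark the shuffle step is handled by the framework, and the documents below are made-up sample input:

```python
from collections import defaultdict

documents = ["big data big pipelines", "data pipelines at scale"]

# Map: emit (key, 1) pairs for every word.
mapped = [(word, 1) for doc in documents for word in doc.split()]

# Shuffle: group values by key (done by the framework in Hadoop/Spark).
groups = defaultdict(list)
for key, value in mapped:
    groups[key].append(value)

# Reduce: aggregate each key's values.
counts = {word: sum(values) for word, values in groups.items()}
print(counts["big"], counts["data"], counts["pipelines"])  # 2 2 2
```

The same map/shuffle/reduce decomposition is what lets Hive queries and Spark jobs parallelise across a cluster.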
Qualifications and Experience:
Engineering graduates and postgraduates, preferably in Computer Science from premier institutions, with 3-5 years of proven work experience as a Big Data Engineer or in a similar role.
Certifications:
Good to have at least one of the Certifications listed here:
AZ-900 - Azure Fundamentals
DP-200, DP-201, DP-203, AZ-204 - Data Engineering
AZ-400 - DevOps certification
Wekan Company is hiring a React.js Developer for Chennai
Experience: 4 - 7 years overall, with at least 2 years of relevant experience in React.js
Must Have Skills: React, Redux, ES6 and above, HTML5, CSS3
Good to have: Typescript
- Strong verbal and written communication skills, analytical skills and the ability to learn quickly.
- Creative approach to problem-solving with the ability to focus on details while maintaining the big picture view.
- In-depth understanding of JavaScript, the DOM, and related concepts.
- Experience of working on RESTful Web services.
- Responsive web development using HTML5, CSS3, JavaScript.
- Work with cutting-edge frameworks and technologies while advancing your skills along the way
- Developing new user-facing features using React.js and Redux
- Building reusable components and front-end libraries for future use
- Knowledge of WebPack.
- Good experience with a testing framework such as Mocha, Jest, or Jasmine.
- Experience of working in an Agile environment with experience of using tools such as Jira.
- Good communication skills.
- Degree in Computer Science, Engineering or a related subject.