Responsibilities:
- Exploring and visualizing data to understand it, and identifying differences in data distribution that could affect performance when the model is deployed in the real world.
- Verifying data quality, and ensuring it through data cleaning.
- Adapting and working quickly to produce outputs that improve stakeholder decision-making using ML.
- Designing and developing machine learning systems and schemes.
- Performing statistical analysis and fine-tuning models using test results.
- Training and retraining ML systems and models as and when necessary.
- Deploying ML models to production and managing the cost of the cloud infrastructure.
- Developing machine learning applications according to client and data scientist requirements.
- Analyzing the problem-solving capabilities and use cases of ML algorithms, and ranking them by how well they meet the objective.
Technical Knowledge:
- Experience solving real-time problems using ML and deep learning models deployed in production, with strong projects to showcase.
- Proficiency in Python and experience with Jupyter, Google Colab, and cloud-hosted notebooks such as AWS SageMaker, Databricks, etc.
- Proficiency with scikit-learn, TensorFlow, OpenCV, PySpark, pandas, NumPy, and related libraries.
- Expertise in visualising and manipulating complex datasets.
- Proficiency with visualisation libraries such as Seaborn, Plotly, Matplotlib, etc.
- Proficiency in the linear algebra, statistics, and probability required for machine learning.
- Proficiency in ML algorithms, for example gradient boosting, stacked ensembles, classification algorithms, and deep learning algorithms. Experience in hyperparameter tuning of various models and comparing algorithm performance.
- Big data technologies such as the Hadoop stack and Spark.
- Basic use of cloud platforms (e.g., VMs such as EC2).
- Brownie points for Kubernetes and task queues.
- Strong written and verbal communication skills.
- Experience working in an Agile environment.
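As a concrete illustration of the hyperparameter tuning and model comparison asked for above, here is a minimal scikit-learn sketch; the synthetic dataset and the parameter grid are illustrative assumptions, not a prescribed workflow:

```python
# Minimal sketch: hyperparameter tuning a gradient boosting classifier
# with scikit-learn's GridSearchCV. Dataset and grid are illustrative.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

# Synthetic data standing in for a real problem.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Cross-validated search over a small, hypothetical parameter grid.
grid = GridSearchCV(
    GradientBoostingClassifier(random_state=0),
    param_grid={"n_estimators": [50, 100], "learning_rate": [0.05, 0.1]},
    cv=3,
)
grid.fit(X_train, y_train)
print(grid.best_params_, grid.score(X_test, y_test))
```

The same `GridSearchCV` pattern extends to comparing different estimators by wrapping each in its own search and scoring all of them on a held-out test set.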
About SmartJoules
Energy efficiency is the cleanest, quickest and cheapest way to bring more than 300 million Indians out of energy poverty.
By eliminating waste in their own operations, building owners can save money, simplify operations, improve comfort and free up resources for the less fortunate. Smart Joules makes this process seamless and profitable from day one.
Company: PluginLive
About the company:
PluginLive Technology Pvt Ltd is a leading provider of innovative HR solutions. Our mission is to transform the hiring process through technology and make it easier for organizations to find, attract, and hire top talent. We are looking for a passionate and experienced Data Engineering Lead to guide the data strategy and engineering efforts for our Campus Hiring Digital Recruitment SaaS Platform.
Role Overview:
The Data Engineering Lead will be responsible for leading the data engineering team and driving the development of data infrastructure, pipelines, and analytics capabilities for our Campus Hiring Digital Recruitment SaaS Platform. This role requires a deep understanding of data engineering, big data technologies, and team leadership. The ideal candidate will have a strong technical background, excellent leadership skills, and a proven track record of building robust data systems.
Job Description
Position: Data Engineering Lead - Campus Hiring Digital Recruitment SaaS Platform
Location: Chennai
Minimum Qualification: Bachelor's degree in Computer Science, Engineering, Data Science, or a related field. A Master's degree or equivalent is a plus.
Experience: 7+ years of experience in data engineering, with at least 3 years in a leadership role.
CTC: 20-30 LPA
Employment Type: Full Time
Key Responsibilities:
Data Strategy and Vision:
- Develop and communicate a clear data strategy and vision for the Campus Hiring Digital Recruitment SaaS Platform.
- Conduct market research and competitive analysis to identify trends, opportunities, and data needs.
- Define and prioritize the data roadmap, aligning it with business goals and customer requirements.
Data Infrastructure Development:
- Design, build, and maintain scalable data infrastructure and pipelines to support data collection, storage, processing, and analysis.
- Ensure the reliability, scalability, and performance of the data infrastructure.
- Implement best practices in data management, including data governance, data quality, and data security.
Data Pipeline Management:
- Oversee the development and maintenance of ETL (Extract, Transform, Load) processes.
- Ensure data is accurately and efficiently processed and available for analytics and reporting.
- Monitor and optimize data pipelines for performance and cost efficiency.
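The ETL responsibilities above can be sketched as a minimal extract/transform/load pipeline in pandas; the column names, filter rule, and in-memory "load" target are hypothetical stand-ins for real source and warehouse systems:

```python
# Minimal ETL sketch with pandas. Column names, the score threshold,
# and the dict "warehouse" are hypothetical illustrations.
import pandas as pd

def extract() -> pd.DataFrame:
    # In practice this would read from a source system (DB, API, files).
    return pd.DataFrame({"candidate": ["a", "b", "b"], "score": [70, 85, 85]})

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Deduplicate records and keep only candidates above a threshold.
    return df.drop_duplicates().query("score >= 75")

def load(df: pd.DataFrame, store: dict) -> None:
    # Stand-in for writing to a warehouse table.
    store["candidates"] = df

store = {}
load(transform(extract()), store)
print(store["candidates"])
```

In a production pipeline each stage would typically be a separate, monitored task (for example an orchestrated DAG), which is what makes the performance and cost optimization described above tractable.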
Data Analytics and Reporting:
- Collaborate with data analysts and data scientists to build and deploy advanced analytics and machine learning models.
- Develop and maintain data models, dashboards, and reports to provide insights and support decision-making.
- Ensure data is easily accessible and usable by stakeholders across the organization.
Team Leadership:
- Lead, mentor, and guide a team of data engineers, fostering a culture of collaboration, continuous improvement, and innovation.
- Conduct code reviews, provide constructive feedback, and ensure adherence to development standards.
- Collaborate with cross-functional teams including product, engineering, and marketing to ensure alignment and delivery of data goals.
Stakeholder Collaboration:
- Work closely with stakeholders to understand business requirements and translate them into technical specifications.
- Communicate effectively with non-technical stakeholders to explain data concepts and progress.
- Participate in strategic planning and decision-making processes.
Skills Required:
- Proven experience in designing and building scalable data infrastructures and pipelines.
- Strong proficiency in programming languages such as Python and R, and in data visualization tools such as Power BI, Tableau, Qlik, and Google Analytics.
- Expertise in big data technologies such as Apache Airflow, Hadoop, Spark, and Kafka, and in cloud data platforms such as AWS and Oracle Cloud.
- Solid understanding of database technologies, both SQL and NoSQL.
- Experience with data modeling, data warehousing, and ETL processes.
- Strong analytical and problem-solving abilities.
- Excellent communication, collaboration, and leadership skills.
Preferred Qualifications:
- Experience in HR technology or recruitment platforms.
- Familiarity with machine learning and AI technologies.
- Knowledge of data governance and data security best practices.
- Contributions to open-source projects or active participation in the tech community.
- 8+ years of experience in C++ development
Software: C++, Jenkins, Visual Studio, Linux, Checkmarx, GitHub, GoogleTest framework.
Application Architecture: scalable, resilient, cloud-deployable, high-performance.
DB: Oracle
Libraries: CPLEX (knowledge of CPLEX is a plus)
2. Knowledge of different areas such as UX/UI, application/infrastructure architecture, payment gateways, and support, to build a best-in-class product.
3. Oversee the architecture of a product to attain maximum efficiency.
4. Conduct code reviews and specification-conformance testing.
5. Establish an application deployment process.
6. Identify new technology trends and keep an eye on consumers' evolving tech behavior.
7. Server-side administration and mobile app maintenance.
8. Ability to design and architect new features in an existing product.
9. Plan cloud infrastructure for future requirements.
10. Testing and QA monitoring.
11. Day-to-day management of the team's work products, including review of effort estimates provided for requirements.
12. Ensure timely delivery based on the discussed project plan.
> Freshers and experienced candidates can apply.
> Work-from-office role.
> Salary up to Rs. 3 LPA (based purely on the interview).
> Face-to-face interview mode.
Product-based Experience
TDD/BDD Experience
SQL Databases
Good to Have
Experience with Third-party Integrations
Queueing Systems
NoSQL Databases
Understanding of Rails Internals
Application Monitoring and Error Reporting Tools
Hey
We are looking for freelance iOS Developers for our new startup.
Apply only if you are a developer who follows best practices and builds quality apps.
You can work on your preferred schedule each week, but we are strict about deadlines.
We expect you to be good at
- Managing REST services
- Managing cache and persistent storage
- Building pixel-perfect UI compatible with all screen sizes (designs will be provided)
- Pushing your code daily using Git
- Writing clean and understandable code
All the best!