Want to shape the future of Energy through Data Science? We all know that without good data there is no Data Science: garbage in, garbage out. 60-70% of the effort in a data science project is spent on Data Engineering and Feature Engineering. That's where we need your skills: to fetch data from disparate sources, transform it the way the business needs (which may include applying critical business logic depending on each source's nature), and load it into a data warehouse or big data system. This critical work complements the Data Scientist, with a continuous feedback loop based on how a model is performing and what fine-tuning the data needs.

The Energy Exemplar (EE) data team is looking for an experienced Data Engineer to join our Pune office. As a dedicated Data Engineer on our Research team, you will apply data engineering expertise and work closely with the core data team to identify data sources for specific energy markets and create an automated data pipeline. The pipeline will then incrementally pull data from its sources and maintain a dataset, which in turn provides tremendous value to hundreds of EE customers.

At EE, you'll have access to vast amounts of energy-related data from our sources. Our data pipelines are curated and supported by engineering teams. We also offer many company-sponsored classes and conferences that focus on data science and ML. There's great growth opportunity for data science at EE.

Responsibilities
- Develop, test, and maintain architectures, such as databases and large-scale processing systems, using high-performance data pipelines.
- Recommend and implement ways to improve data reliability, efficiency, and quality.
- Identify performant features and make them universally accessible to our teams across EE.
- Work with data analysts and data scientists to wrangle data and provide quality datasets and insights for business-critical decisions.
- Take end-to-end responsibility for the development, quality, testing, and production readiness of the services you build.
- Define and evangelize Data Engineering standards and best practices to ensure engineering excellence at every stage of the development cycle.
- Act as a resident expert for data engineering, feature engineering, and exploratory data analysis.

Qualifications
- 2+ years of professional experience developing data pipelines for large-scale, complex datasets from a variety of data sources.
- Data Engineering expertise with strong experience in Big Data technologies such as Hadoop, Hive, Spark, Scala, and Python.
- Experience with cloud-based data technologies such as Azure Data Lake, Azure Data Factory, and Azure Databricks highly desirable.
- Knowledge of and experience with database systems such as Cassandra, HBase, and Cosmos DB.
- Moderate coding skills. SQL or similar required. C# or other languages strongly preferred.
- Proven track record of designing and delivering large-scale, high-quality systems and software products.
- Outstanding communication and collaboration skills. You can learn from and teach others.
- Strong drive for results. You have a proven record of shepherding experiments into successful shipping products and services.
- Experience with prediction in adversarial (energy) environments highly desirable.
- A Bachelor's or Master's degree in Computer Science or Engineering with coursework in Statistics, Data Science, Experimentation Design, and Machine Learning highly desirable.
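The incremental-pull pattern the posting describes (a pipeline that repeatedly fetches only new source rows and maintains a growing dataset) can be sketched in a few lines of plain Python. The row shape, field names, and in-memory stand-ins for the source and warehouse below are all hypothetical, purely for illustration; a real pipeline would read from a database or API and persist the watermark between runs:

```python
# Hypothetical in-memory "source" standing in for a real database or API.
# ISO-8601 timestamps compare correctly as strings.
SOURCE = [
    {"id": 1, "updated_at": "2024-01-01T00:00:00", "load_mw": 410.0},
    {"id": 2, "updated_at": "2024-01-02T00:00:00", "load_mw": 395.5},
    {"id": 3, "updated_at": "2024-01-03T00:00:00", "load_mw": 402.3},
]

def incremental_pull(source, warehouse, watermark):
    """Fetch only rows newer than the last-seen watermark, append them
    to the warehouse, and return the advanced watermark."""
    new_rows = [r for r in source if r["updated_at"] > watermark]
    warehouse.extend(new_rows)
    if new_rows:
        watermark = max(r["updated_at"] for r in new_rows)
    return watermark

warehouse = []
# Row 1 is at the watermark, so only rows 2 and 3 are pulled.
wm = incremental_pull(SOURCE, warehouse, "2024-01-01T00:00:00")
```

Persisting the watermark is what makes each scheduled run pull only the delta rather than reloading the full source.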
About the Company
Leap Info Systems Pvt. Ltd. is a software product company with products and solutions in convergent lighting controls and automation. Leap recently acquired elitedali, the world's first Niagara-based lighting controls and automation solution, from a leading US organization. We are a passionate team on a mission to develop innovative controls and automation products. We are expanding our product development team and are searching for like-minded, highly passionate team members who would like to contribute to a leading lighting controls and automation product. We have customers in India, Europe, the USA, and Australia.

Eligibility
Any suitable graduate (0 to 2 years of experience) who meets the prescribed qualities and job responsibilities. Preferred but not limited to engineering graduates in Computer/IT/Instrumentation/EnTC etc.

Job Responsibilities
Be a part of the product development team, maintaining existing products as well as developing new products based on Java and the Niagara Software framework.

Qualities
- Self-learner
- Analytical skills
- Able to work with a cross-functional team
- Problem-solving approach
- Good team player with positive vibes

What you must have
- Hands-on programming at the academic or hobby level.
- Good knowledge of Java/J2EE-related technologies.
- Good knowledge of solution-based approaches.
- Good communication skills - phone, email, and in-person.

What you can expect from LEAP
- A positive, conducive working environment in which to grow with cross-functional team members.
- Flexible working hours with defined responsibilities.
- Opportunity to work with a leading open-standard automation framework (Niagara Software) and lighting controls technologies such as DALI, wireless, etc.
- Be an active part of an emerging global product company.
Responsibilities
- Work with developers to design algorithms and flowcharts.
- Prepare GUI dummy screens for proposed software using Excel VBA (to give an overview of how the software's buttons and flow of information should work).
- Coordinate with the software developer team to explain the criteria.
- Produce clean, efficient code based on specifications.
- Integrate software components and third-party programs.
- Verify and deploy programs and systems.
- Troubleshoot, debug, and upgrade existing software.
- Gather and evaluate user feedback.
- Recommend and execute improvements.
- Create technical documentation for reference and reporting.

Requirements
- Proven experience as a Software Developer, Software Engineer, or similar role.
- Familiarity with development methodologies.
- Experience with software design and development in a test-driven environment.
- Knowledge of coding languages (e.g. C#, C++) and frameworks/systems.
- Ability to learn new languages and technologies.
- Excellent communication skills.
- Resourcefulness and troubleshooting aptitude.
- Attention to detail.
- Sound technical knowledge; thorough knowledge of all related codes and section details is desired.
- Thorough knowledge of the design of components of residential/commercial structures is desired.
- Accuracy in following the process and jobs is required.
- Experience interacting with international clients will be preferred.
Description
- Deep experience with and understanding of Apache Hadoop and surrounding technologies required; experience with Spark, Impala, Hive, Flume, Parquet, and MapReduce.
- Strong understanding of development languages including Java, Python, Scala, and shell scripting.
- Expertise in Apache Spark 2.x framework principles and usage.
- Proficient in developing Spark batch and streaming jobs in Python, Scala, or Java.
- Proven experience in performance tuning of Spark applications, from both an application-code and a configuration perspective.
- Proficient in Kafka and its integration with Spark.
- Proficient in Spark SQL and data warehousing techniques using Hive.
- Very proficient in Unix shell scripting and operating on Linux.
- Knowledge of any cloud-based infrastructure.
- Good experience in tuning Spark applications and performance improvements.
- Strong understanding of data profiling concepts and the ability to operationalize analyses into design and development activities.
- Experience with software development best practices: version control systems, automated builds, etc.
- Experienced in, and able to lead, all phases of the Software Development Life Cycle on any project (feasibility planning, analysis, development, integration, test, and implementation).
- Capable of working within a team or as an individual.
- Experience creating technical documentation.
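The role assumes fluency with the MapReduce model that underlies Hadoop and Spark batch jobs. As a rough conceptual illustration (plain Python, not Hadoop or Spark API code), the classic word count decomposes into a map phase that emits key/value pairs, a shuffle phase that groups pairs by key, and a reduce phase that aggregates each group:

```python
from collections import defaultdict
from functools import reduce

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in every input line.
    return [(word, 1) for line in lines for word in line.split()]

def shuffle_phase(pairs):
    # Shuffle: group intermediate values by key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: sum the grouped values for each key.
    return {k: reduce(lambda a, b: a + b, vs) for k, vs in groups.items()}

lines = ["spark hive spark", "kafka spark"]
counts = reduce_phase(shuffle_phase(map_phase(lines)))
# counts == {"spark": 3, "hive": 1, "kafka": 1}
```

In a real cluster, the map and reduce phases run in parallel across partitions of the data and the shuffle moves intermediate pairs over the network; the per-record logic, however, is exactly this shape.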
We at InfoVision Labs are passionate about technology and about what our clients would like to accomplish. We continuously strive to understand business challenges, the changing competitive landscape, and how cutting-edge technology can position our clients at the forefront of the competition. We are a fun-loving team of usability experts and software engineers, focused on mobile technology, responsive web solutions, and cloud-based solutions.

Job Responsibilities:
◾Minimum 3 years of experience in Big Data skills required.
◾Complete life-cycle experience with Big Data is highly preferred.
◾Skills: Hadoop, Spark, R, Hive, Pig, HBase, and Scala.
◾Excellent communication skills.
◾Ability to work independently with no supervision.
Crest (part of the Springer Nature group): Headquartered in Pune, Crest is a Springer Nature company that delivers cutting-edge IT and ITeS solutions to some of the biggest scientific content and database brands in the world. Our global teams work closely with our counterparts and clients in Europe, the USA, and New Zealand, leveraging the latest technology, marketing intelligence, and subject matter expertise. With handpicked SMEs in a range of sciences, and technology teams working on the latest ECM, Scala, SAP, and MS Tech platforms, Crest not only develops quality STM content but continuously enhances the channels through which it is delivered to the world. Crest is ISO 9001 certified and driven by over 1000 professionals in Technology, Research & Analysis, and Marketing & BPM.

Specialties:
1. Technology
2. Research
3. Marketing Intelligence
4. Business Process Management