Want to shape the future of Energy through Data Science? We have the data; if you have the skills to unlock the patterns behind how a small change in one input parameter can have an outsized impact on optimized energy outputs such as energy price, read on. The Energy Exemplar (EE) data team is looking for an experienced Applied ML Data Scientist to join our Pune office. The data team is committed to helping EE customers track how heat rate, capacity expansion and daily unit commitment respond to variations in demand, renewables, gas price and more; there are many such use cases. By continuously gathering and analysing data, and by working with organizations inside and outside EE, the data team stays agile against evolving challenges. Our mission is to help advise customers and systems with industry-leading, proactive, optimal predictions, and to engage in valuable partnerships.

As a dedicated Data Scientist on our Research team, you will apply your data science and machine learning expertise to enhance our intelligent systems so they predict and provide proactive advice. You'll work with the team to identify and build features, create experiments, vet ML models, and ship successful models that add value for hundreds of EE customers. At EE, you'll have access to vast amounts of energy-related data from our sources. Our data pipelines are curated and supported by engineering teams (so you won't have to do much data engineering; you get to do the fun stuff). We also offer many company-sponsored classes and conferences focused on data science and ML. There is great growth opportunity for data science at EE.

Responsibilities:
- Monitor and analyse data to uncover optimization gaps.
- Develop high-performance algorithms, machine learning models, or other methodologies to close optimization gaps.
- Identify performant features and models and make them universally accessible to our teams across EE.
- Provide technical leadership to our team by reviewing problem sets, proposing prediction models, and reviewing experiments and models.
- Act as a resident expert for machine learning, statistics, and experiment design.

Qualifications:
- 5+ years of professional experience in experiment design and applied machine learning, predicting outcomes in large-scale, complex datasets.
- Proficiency in Python, Azure ML, or other statistics/ML tools.
- Proficiency in deep neural networks and Python-based frameworks.
- Proficiency in Azure Databricks, Hive and Spark.
- Proficiency in deploying models into production (Azure stack).
- Moderate coding skills: SQL or similar required; C# or other languages strongly preferred.
- Outstanding communication and collaboration skills; you can learn from and teach others.
- Strong drive for results, with a proven record of shepherding experiments into successful shipped products/services.
- Experience with prediction in adversarial (energy) environments highly desirable.
- Understanding of the model development ecosystem across platforms, including development, distribution, and best practices, highly desirable.
- A Master's or Ph.D. degree with coursework in Statistics, Data Science, Experiment Design, and Machine Learning highly desirable.
Job Title: Urgent Opening - Python Developer
Job Location: Pune (Baner)
Experience: 2 to 6 years
Annual CTC: 400000-1400000
Notice Period: Max 2 months
Skills: Python (Django, Flask), PHP (CI), MySQL, strong focus on OOP and architecture, Bitbucket/GitHub, NoSQL

Technical Requirements:
- Experience in developing web applications and APIs (REST, XML, other open standards).
- Strong programming foundation in Python, MySQL and OOP.
- Experience in Django.
- Good understanding of HTML5, CSS3, Bootstrap, Ajax and JS; experience with Angular or Node.js is an added advantage.
- Solid exposure to API integrations and familiarity with various design and architectural patterns.
- In-depth knowledge of source code repositories and experience working with Bitbucket.
- Experience working with Apache HTTP or another web/app server.
- Hands-on experience in DB design, architecture, coding, unit testing and debugging.
- Experience working in an Agile development environment.
- Sound in data structure analysis and algorithm design.
- Ensure cross-platform compatibility of information retrieved from web services on Android and iOS (push notifications, platform-specific issues, etc.).
- Good knowledge of relational databases, version control tools and web-service development.
- Strong understanding of the software development life cycle and best practices.

Roles and Responsibilities:
- Be a problem solver with an attitude to contribute to the success of the team/project as well as the organization.
- Guide other members of the team.
- Take initiative to improve code quality standards and team efficiency.
- Participate in requirements gathering and come up with efficient solutions.
- Efficiently estimate at high and low levels, and assess risk items.
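As a rough illustration of the REST API work this posting describes, here is a minimal JSON endpoint sketched with Python's standard library only; the `/api/users` route and its payload are invented for illustration, and a real Django or Flask view would replace this:

```python
import json
from wsgiref.util import setup_testing_defaults

# Hypothetical in-memory "table" standing in for a MySQL-backed model.
USERS = [{"id": 1, "name": "Asha"}, {"id": 2, "name": "Ravi"}]

def app(environ, start_response):
    """Tiny WSGI app: GET /api/users returns the user list as JSON."""
    if environ.get("PATH_INFO") == "/api/users":
        body = json.dumps(USERS).encode("utf-8")
        status = "200 OK"
    else:
        body = b'{"error": "not found"}'
        status = "404 Not Found"
    start_response(status, [("Content-Type", "application/json"),
                            ("Content-Length", str(len(body)))])
    return [body]

# Exercise the app without starting a server.
environ = {}
setup_testing_defaults(environ)
environ["PATH_INFO"] = "/api/users"
captured = {}
def start_response(status, headers):
    captured["status"] = status
result = b"".join(app(environ, start_response))
```

Frameworks like Django and Flask sit on top of exactly this WSGI contract, which is why the app can be exercised in-process as shown.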
You will work on: Developing and maintaining tools for build, release, deployment, monitoring and operations, on cloud as well as on-premises infrastructure. You will work closely with developers and cloud architects and own infrastructure automation, CI/CD processes and support operations.

What you will do (Responsibilities):
- Day-to-day operational support of CI/CD infrastructure relied upon by teams deploying software to the cloud or on-premises.
- Write code to automate deployment of various services to private or public cloud/on-premises environments.
- Participate in cloud projects to implement new technology solutions and proofs of concept to improve cloud technology offerings.
- Work with developers to deploy to private or public cloud/on-premises services; debug and resolve issues.
- Take on-call responsibility to respond to emergencies and scheduled maintenance.
- Contribute to and maintain documentation for systems, processes, procedures and infrastructure configuration.

What you bring (Skills):
- Strong Linux system skills.
- Scripting in Bash and Python; basic file handling and networking.
- Comfortable with Git repositories, specifically on GitHub, GitLab, Bitbucket, Gerrit.
- Comfortable interfacing with SQL and NoSQL databases such as MySQL, Postgres, MongoDB, Elasticsearch, Redis.

Great if you know (Skills):
- Various build and CI/CD systems: Maven, Gradle, Jenkins, GitLab CI, Spinnaker or cloud-based build systems.
- Exposure to deploying and automating on any public cloud: GCP, Azure or AWS.
- Private cloud experience: VMware or OpenStack.
- Big DataOps experience: managing infrastructure and processes for Apache Airflow, Beam and Hadoop clusters.
- Containerized applications: Docker-based image builds and maintenance.
- Kubernetes applications: deploying and developing operators, Helm charts, manifests and other artifacts.
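A recurring piece of the deployment tooling described above is rendering per-environment service configuration. The sketch below, with an invented service name and settings, shows the usual pattern of layering environment overrides on a base config:

```python
# Base config plus per-environment overrides; all values are invented
# for illustration, not taken from any real deployment.
BASE = {"service": "billing-api", "replicas": 1, "log_level": "info"}
OVERRIDES = {
    "staging": {"replicas": 2},
    "production": {"replicas": 6, "log_level": "warning"},
}

def render_config(env: str) -> dict:
    """Merge per-environment overrides onto the base service config."""
    if env not in OVERRIDES:
        raise ValueError(f"unknown environment: {env}")
    return {**BASE, **OVERRIDES[env]}

prod = render_config("production")
staging = render_config("staging")
```

Tools like Helm, Ansible and Terraform workspaces implement this same base-plus-overrides idea at scale.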
Advantage Cognologix:
- Higher degree of autonomy, startup culture and small teams.
- Opportunities to become an expert in emerging technologies.
- Remote working options for the right maturity level.
- Competitive salary and family benefits.
- Performance-based career advancement.

About Cognologix: Cognologix helps companies disrupt by reimagining their business models and innovating like a startup. We are at the forefront of digital disruption and take a business-first approach to help meet our clients' strategic goals. We are a DevOps-focused organization, helping our clients focus on their core product activities by handling all aspects of their infrastructure, integration and delivery.
Want to shape the future of Energy through Data Science? We all know that without good data there is no Data Science: garbage in, garbage out. 60-70% of the effort in a data science project is spent on data engineering and feature engineering. That's where we need your skills: fetch data from disparate sources, transform it the way the business needs (which may include applying a lot of critical business logic specific to each source and its nature) and load it into a data warehouse or big data system. These critical pieces of work complement the Data Scientist, with a continuous feedback loop based on how a model is performing and what fine-tuning the data needs.

The Energy Exemplar (EE) data team is looking for an experienced Data Engineer to join our Pune office. As a dedicated Data Engineer on our Research team, you will apply data engineering expertise and work closely with the core data team to identify data sources for specific energy markets and create automated data pipelines. Each pipeline will then incrementally pull data from its sources and maintain a dataset, which in turn provides tremendous value to hundreds of EE customers. At EE, you'll have access to vast amounts of energy-related data from our sources. Our data pipelines are curated and supported by engineering teams. We also offer many company-sponsored classes and conferences focused on data science and ML. There is great growth opportunity for data science at EE.

Responsibilities:
- Develop, test and maintain architectures, such as databases and large-scale processing systems, using high-performance data pipelines.
- Recommend and implement ways to improve data reliability, efficiency and quality.
- Identify performant features and make them universally accessible to our teams across EE.
- Work with data analysts and data scientists to wrangle data and provide quality datasets and insights for business-critical decisions.
- Take end-to-end responsibility for the development, quality, testing and production readiness of the services you build.
- Define and evangelize data engineering standards and best practices to ensure engineering excellence at every stage of the development cycle.
- Act as a resident expert for data engineering, feature engineering and exploratory data analysis.

Qualifications:
- 2+ years of professional experience developing data pipelines for large-scale, complex datasets from a variety of data sources.
- Data engineering expertise with strong experience in big data technologies such as Hadoop, Hive, Spark, Scala and Python.
- Experience with cloud-based data technologies such as Azure Data Lake, Azure Data Factory and Azure Databricks highly desirable.
- Knowledge of and experience with database systems such as Cassandra, HBase and Cosmos DB.
- Moderate coding skills: SQL or similar required; C# or other languages strongly preferred.
- Proven track record of designing and delivering large-scale, high-quality systems and software products.
- Outstanding communication and collaboration skills; you can learn from and teach others.
- Strong drive for results, with a proven record of shepherding experiments into successful shipped products/services.
- Experience with prediction in adversarial (energy) environments highly desirable.
- A Bachelor's or Master's degree in Computer Science or Engineering, with coursework in Statistics, Data Science, Experiment Design and Machine Learning, highly desirable.
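The extract-transform-load loop this posting describes can be sketched minimally as follows; the source records, the per-source unit conversion, and the warehouse table are all invented for illustration (a real EE pipeline would run on Spark or Azure Data Factory rather than sqlite3):

```python
import sqlite3

# Extract: raw records from two hypothetical market feeds.
raw_rows = [
    {"source": "market_a", "price": "42.5", "unit": "MWh"},
    {"source": "market_b", "price": "0.0391", "unit": "kWh"},
]

def transform(row):
    """Normalize every record to a price per MWh (source-specific logic)."""
    price = float(row["price"])
    if row["unit"] == "kWh":
        price *= 1000
    return (row["source"], price)

# Load: an in-memory SQLite table stands in for the warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE prices (source TEXT, price_mwh REAL)")
conn.executemany("INSERT INTO prices VALUES (?, ?)",
                 [transform(r) for r in raw_rows])
loaded = conn.execute("SELECT source, price_mwh FROM prices").fetchall()
```

The per-source branch in `transform` is where the "critical business logic per its source and nature" from the posting would live.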
Looking for someone with:
- Experience working on chatbot projects.
- Knowledge of the conversation-platform landscape.
- Past experience with Google Dialogflow, specifically the conversation design product.
- Interpretation of business conversation flows and conversion of those flows into conversation configuration.
- Training multiple chatbot models.
- Testing configured chatbots.
- Understanding of deploying models to production.
- Good troubleshooting and problem-solving skills.
- Ability to maintain growing sets of chatbot intents and dialog flows.
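To ground the intent/utterance vocabulary used above, here is a toy keyword-based intent matcher; the intents and training phrases are invented, and Dialogflow's actual matching is ML-based and far richer than this sketch:

```python
# Map each hypothetical intent to a few training phrases.
INTENTS = {
    "check_balance": ["balance", "account balance", "how much money"],
    "reset_password": ["reset password", "forgot password", "can't log in"],
}

def match_intent(utterance: str) -> str:
    """Return the first intent whose training phrase appears in the text,
    falling back to a catch-all intent when nothing matches."""
    text = utterance.lower()
    for intent, phrases in INTENTS.items():
        if any(p in text for p in phrases):
            return intent
    return "fallback"
```

Maintaining a "growing set of intents", as the posting puts it, amounts to curating this mapping (and its training phrases) as new conversation flows are added.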
Job Brief
Role and Responsibilities: Provide technical assistance to customers by performing the following duties:
- Take control of and resolve complex technical and escalated customer problems.
- Demonstrate excellent communication and troubleshooting skills.
- Document and simulate complex customer issues to find solutions and fixes to customer inquiries and problems, and dispatch additional service as necessary.
- Identify and provide input to Product and Engineering teams on unique and/or recurring customer problems.
- Collaborate, work alongside, and build mutually beneficial relationships with cross-functional teams (Customer Happiness, Customer Success, Sales, Product, Engineering).
- Perform routine maintenance of internal services as and when required.
- Analyze, research, and solve technical and unique problems.
- Show good computer science fundamentals.
- Be creative, independent, self-motivated, and willing to learn new technologies.
- Prepare accurate and timely reports.

Skills and Requirements:
- Good understanding of QA methodologies and processes.
- Good command of Linux CLI tools, including system administration, data analysis and munging.
- Excellent command and understanding of databases (RDBMS, Mongo) and key-value stores like Redis.
- Proficiency in at least one scripting language: Python, Node, etc.
- Prior experience testing back-end systems that interact with multiple RESTful services.
- Proactive thinking; excellent follow-through and attention to detail.
- Knowledge of distributed applications/service-oriented applications is a plus.

Good to have:
- Knowledge of Kibana/Elasticsearch.
- Knowledge of error-monitoring tools like Sentry, Stackdriver, etc.
- Knowledge of APM tools like New Relic, AppDynamics, Ruxit, etc.
- Knowledge of support tools like Freshdesk, Jira Service Desk or Zendesk.

Required Experience and Qualifications:
- An engineering / BSc / equivalent degree with 1-2 years of relevant work experience.
- Ability to work independently and efficiently to meet deadlines and SLAs.
- Ability to promptly answer support-related tickets, chats, emails and phone calls.
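The "data analysis and munging" this support role calls for often starts with log triage. The sketch below, over an invented log format, counts error codes the way a quick Python one-off (or a `grep | sort | uniq -c` pipeline) would:

```python
import re
from collections import Counter

# Invented application log; a real one would be read from a file.
LOG = """\
2021-03-01T10:00:01 INFO request ok
2021-03-01T10:00:02 ERROR 502 upstream timeout
2021-03-01T10:00:03 ERROR 500 unhandled exception
2021-03-01T10:00:04 ERROR 502 upstream timeout
"""

# Pull the three-digit code out of every ERROR line and tally them.
pattern = re.compile(r"ERROR (\d{3})")
counts = Counter(pattern.findall(LOG))
worst_code, worst_n = counts.most_common(1)[0]
```

Summaries like `counts` are exactly what feeds the "accurate and timely reports" and the escalations to Engineering mentioned above.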
Dear Candidate, This is Amit Sali, HR manager at Krish Compusoft (KCSIT Global). KCSIT Global is a CMMI Level 3, ISO 27001 certified cloud and data solutions company. It has an international presence in the US, UK and South Africa, along with development centers in India in Ahmedabad and Pune. KCS partners with Microsoft (Gold partner), Google Cloud, Amazon (cloud partner) and other OEMs. We are urgently hiring a Big Data Architect at Viman Nagar, Pune or Ahmedabad, Gujarat for our company.

Job Type: Permanent
Job Purpose: We are looking for a Data Architect to design, implement and maintain a big data platform on the cloud, and to oversee processes that ensure a secure data pipeline is implemented: overseeing development and implementation of data ingestion and processing, ensuring data is provided in an easily consumable form to business partners, and providing technical leadership and support to secure the data.

Key Accountabilities:
- Understand company needs to define platform specifications.
- Plan and design the architecture of the data platform.
- Partner with business partners to understand their need for data and the form in which the data is required.
- Evaluate and select appropriate software or hardware and suggest integration methods.
- Oversee assigned programs (e.g. conduct code reviews) and provide guidance to team members.
- Assist with solving technical problems when they arise.
- Ensure the implementation of the agreed architecture and infrastructure.
- Address technical concerns, ideas and suggestions.
- Monitor systems to ensure they meet both user needs and business goals.

Skills and Competencies:
- Proven experience as a Big Data Architect.
- Strong background in big data infrastructure, engineering and development, and in working with big data and the Hadoop file system.
- Hands-on experience with the Hadoop ecosystem (HDFS, Sqoop, Hive, Pig, Spark, Scala).
- Successful background as an architect on EDW/data lake projects.
- Understanding of strategic IT solutions.
- Experience building cloud-native, container-based solutions.
Company Profile: Flentas helps startups, SMEs and enterprises that want to leverage the full potential of the cloud by making their journey to the cloud a successful one. As an organization, Flentas is focused on cloud consulting, IoT, DevOps practice and implementation, cloud governance automation, and load/performance testing and tuning of high-traffic, high-volume cloud applications. Flentas serves clients globally, of all shapes and sizes, with a strong and passionate team of experienced solution architects and technology enthusiasts.

Job Brief: We are looking for candidates who have development experience and have delivered CI/CD-based projects. You should have good hands-on experience with the Jenkins master-slave architecture and with AWS-native services like CodeCommit, CodeBuild, CodeDeploy and CodePipeline, and experience setting up cross-platform CI/CD pipelines that span different cloud platforms, or on-premise and cloud platforms.

Job Location: Satara Road, Pune.

Job Description:
- Hands-on with AWS (Amazon Web Services) Cloud, DevOps services and CloudFormation.
- Experience interacting with customers; excellent communication.
- Hands-on in creating and managing Jenkins jobs; Groovy scripting.
- Experience setting up cloud-agnostic and cloud-native CI/CD pipelines.
- Experience in Maven.
- Experience in scripting languages like Bash, PowerShell, Python.
- Experience in automation tools like Terraform, Ansible, Chef, Puppet.
- Excellent troubleshooting skills.
- Experience in Docker and Kubernetes, including creating Dockerfiles.
- Hands-on with version control systems like GitHub, GitLab, TFS, Bitbucket, etc.
As a Senior Tech Lead: You will be part of a thought-leadership team that designs and develops a leading cyber security solution protecting the digital assets of organizations such as Apple and the US Federal Government. Used by global Fortune 100 corporations, this solution will be massively scalable to secure their global networks.

You will bring to the table:
- Domain: Networking and Network Security
- Primary Skills: Java, Spring & Hibernate
- Secondary Skills: Any one of Python / JavaScript / AngularJS / Shell / ANTLR / Groovy

Expertise:
- Excellent skills and experience in Java, Spring & Hibernate.
- Minimum 2 years of experience in the Networking and Network Security domain.
- Any scripting language: Python / JavaScript / AngularJS / Shell / ANTLR / Groovy.
- Strong object-oriented design skills, data structures, algorithms, and design patterns.
- Tools: Pivotal / GitHub / Jenkins.
- Good to have: database design and management experience.

What you will do:
- Be hands-on, writing high-quality code and ensuring on-time delivery.
- Provide guidance on software design, architecture, and interface choices.
- Design highly scalable, reliable, secure and fault-tolerant systems with minimal guidance.
- Mentor engineers on design, coding, and troubleshooting.
- Analyse requirements and problems and solve them with the best solution.
- Create platforms, reusable libraries, and utilities wherever applicable.
- Work in a cross-functional team, collaborating with peers during the entire SDLC.
- Work as part of a team to solve complex technical problems.
- Support customer queries and escalations to keep customer satisfaction high.

About Benison: Benison Tech is a niche technology company that has been appointed by Intel, Broadcom, Cisco, Check Point, and Marvell to collaboratively spearhead next-generation Network Security, 5G and Wireless technologies. We help our mutual customers get to market faster by applying our core technical brilliance to solving complex engineering problems.
We work with the world's leading technology companies on the latest bleeding-edge technologies, from 5G enablement to real-time ML-based network security systems. Our interview process isn't easy, but it is necessary to ensure that we are a fit for each other. You will be working in a dynamic, fast-paced environment on cutting-edge technologies, so roll up your sleeves and get ready for the challenge. We need people who are drawn to technology challenges rather than a plush corporate role. You are a fit for Benison if:
- You want to work in the technologies of the future: Network Security, cloud technologies, 5G and WiFi 6.
- You have a deep-rooted desire to learn new technologies.
- You are driven by a passion for solving complex problems.
- You want to work with some of the best minds in the industry.
Job Title: Geekyworks - Sr Backend (Python and PHP) Developer
Job Location: Pune (Baner)
Experience: 5 to 8 years
Annual CTC: 700000-1200000
Notice Period: Max 2 months
Skills: Python (Django, Flask), PHP (CI), MySQL, strong focus on OOP and architecture, Bitbucket/GitHub, NoSQL

Technical Requirements:
- Work with development teams and product managers to conceptualize software solutions.
- Develop and maintain applications using Python/Django/Flask and MySQL; experience in a PHP-based framework is a good plus.
- Participate in a team-oriented environment to develop complex web-based applications.
- Add new features to existing code; data structure analysis and algorithm design; solve complex performance problems and architectural challenges.
- Experience in developing web applications and APIs (REST, XML, other open standards).
- Good understanding of HTML5, CSS3, Bootstrap, Ajax and JS; experience with Angular or Node.js is an added advantage.
- In-depth knowledge of source code repositories and experience working with Bitbucket.
- Hands-on experience in API integration, DB design, architecture/design patterns, best coding practices and debugging.
- Experience working in an Agile development environment.
- Ensure cross-platform compatibility of information retrieved from web services on Android and iOS (push notifications, platform-specific issues, etc.).

Roles and Responsibilities:
- Be a problem solver with an attitude to contribute to the success of the team/project as well as the organization.
- Guide other members of the team.
- Take initiative to improve code quality standards and team efficiency.
- Participate in requirements gathering and come up with efficient solutions.
- Perform risk analysis and efficiently estimate at high and low levels.
- Must have 5-8 years of experience handling data.
- Must have the ability to interpret large amounts of data and to multi-task.
- Must have strong knowledge of and experience with programming (Python), Linux/Bash scripting, and databases (SQL, etc.).
- Must have strong analytical and critical thinking to resolve business problems using data and technology.
- Must have domain familiarity with, and interest in, cloud technologies (GCP, Microsoft Azure, Amazon AWS), open-source technologies and enterprise technologies.
- Must have the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Must have good communication skills.
- Working knowledge of / exposure to Elasticsearch, PostgreSQL, Athena, PrestoDB, Jupyter Notebook.
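The SQL-plus-Python pairing this role asks for typically looks like the sketch below: aggregate in SQL, sanity-check in Python. The dataset and table are invented for illustration (a real setup would query PostgreSQL, Athena or PrestoDB rather than an in-memory SQLite database):

```python
import sqlite3
import statistics

# Invented revenue records: (region, amount).
rows = [("eu", 120), ("eu", 80), ("us", 200), ("us", 100), ("us", 60)]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE revenue (region TEXT, amount INTEGER)")
conn.executemany("INSERT INTO revenue VALUES (?, ?)", rows)

# Aggregate in SQL...
totals = dict(conn.execute(
    "SELECT region, SUM(amount) FROM revenue GROUP BY region"))

# ...and cross-check the overall mean in Python.
overall_mean = statistics.mean(amount for _, amount in rows)
```

In practice the same GROUP BY would run against the warehouse and the cross-check would live in a Jupyter notebook, but the division of labor is the same.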
Key Responsibilities:
- Work with India and US managers to design end-to-end technology solutions in the DWH/BI space.
- Work with the India manager to manage overall project delivery and lead project planning, system design and development, testing, UAT and deployment activities.
- Work closely with the 159 team and the client's business and IT teams to gather project requirements.
- Develop client relationships and serve as the primary contact for all project-related communications.
- Build technical solutions using the latest open-source and cloud-based technologies such as AWS Redshift, RDS, Glue, Apache Airflow, etc.
- Build demos and POCs in support of business development for new and existing clients.
- Lead creation of PowerPoint slides and online visualizations (e.g. Tableau, Qlik, Sisense) to communicate findings.
- Work with the India manager to build and grow a team of analysts and consultants with expertise in ETL, BI reporting, Python and analytics support.
- Mentor a team of 5 to 8 consultants/analysts on an ongoing basis.
- Conduct training sessions to train analysts and help shape their growth.
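ETL pipelines like the Airflow-based ones mentioned above are directed acyclic graphs of tasks. The sketch below (task names invented) computes one valid execution order with the standard library's `graphlib`; Airflow's scheduler performs this dependency resolution for real, alongside retries and scheduling:

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Each task maps to the set of tasks it depends on.
dag = {
    "load_warehouse": {"transform"},
    "transform": {"extract_sales", "extract_crm"},
    "extract_sales": set(),
    "extract_crm": set(),
}

# static_order() yields tasks so every dependency runs before its dependents.
order = list(TopologicalSorter(dag).static_order())
```

Both extracts can run in parallel here, which is exactly the parallelism an orchestrator like Airflow exploits once dependencies are declared this way.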