We are looking for a smart data engineer with production-level experience building Big Data solutions and handling large volumes of data. Specifically, the following Big Data skills will help:
- Production experience with Avro
- Production experience with Kafka, Kafka Connect, and Confluent Schema Registry (managing Avro over Kafka)
- Experience with the Snowflake data warehouse (not mandatory, but very nice to have)
- Ideally, both Python and Scala experience
- Experience with Spark a plus
- Experience with any of Databricks, EMR, or Hudi is good to have but not mandatory

Some other non-negotiable requirements:
- Good academics
- Excellent communication skills

Candidates who are ready and immediately available to join are preferred. Remote working is not an issue anymore.

About Tech Prescient - We are a product development and technology services company working with customers to build awesome products. We work with customers to design and develop their product stack, and hence the quality of the work we produce is always premium. We are looking for equally motivated people to join our vibrant team, and we are sure we will make it a win-win situation.
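The "managing Avro over Kafka" skill above centers on Confluent's documented wire format: each Kafka message payload carries a one-byte magic byte (0) and a 4-byte big-endian Schema Registry schema ID before the Avro-encoded body. As a minimal sketch (the function names are illustrative, not part of any client library):

```python
import struct

MAGIC_BYTE = 0  # Confluent wire format: framed payloads start with a zero magic byte


def frame_message(schema_id: int, avro_payload: bytes) -> bytes:
    """Prefix an Avro-encoded payload with the Confluent wire-format header:
    1 magic byte, then the 4-byte big-endian Schema Registry schema ID."""
    return struct.pack(">bI", MAGIC_BYTE, schema_id) + avro_payload


def unframe_message(message: bytes) -> tuple[int, bytes]:
    """Split a framed Kafka message back into (schema_id, avro_payload)."""
    magic, schema_id = struct.unpack(">bI", message[:5])
    if magic != MAGIC_BYTE:
        raise ValueError(f"unknown magic byte: {magic}")
    return schema_id, message[5:]
```

In practice the Confluent serializers handle this framing for you; the sketch only shows why a consumer without Schema Registry access sees five "extra" bytes at the front of each message.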
KCSIT Global is a CMMI Level 3, ISO 27001 certified cloud and data solutions company. It has an international presence in the US, UK, and South Africa, along with development centers in India in Ahmedabad and Pune. KCS partners with Microsoft (Gold Partner), Google Cloud, and Amazon cloud, as well as other OEMs.

We are urgently hiring a Big Data Architect at Viman Nagar, Pune and Ahmedabad, Gujarat. Kindly send your updated profile if you are interested.

Job Type: Permanent
Company Website: https://www.kcsitglobal.com/

Job Purpose:
Looking for a Big Data Architect to design, implement, and maintain a big data platform on the cloud, and to oversee processes to ensure a secure data pipeline is implemented:
- Oversee development and implementation of data ingestion and processing
- Ensure data is provided in an easily consumable form to business partners
- Provide technical leadership and support to secure the data

Key Accountabilities:
- Understand company needs to define platform specifications
- Plan and design the architecture of the data platform
- Partner with business partners to understand the need for data and the form in which it is required
- Evaluate and select appropriate software or hardware and suggest integration methods
- Oversee assigned programs (e.g. conduct code reviews) and provide guidance to team members
- Assist with solving technical problems when they arise
- Ensure the implementation of the agreed architecture and infrastructure
- Address technical concerns, ideas, and suggestions
- Monitor systems to ensure they meet both user needs and business goals

Skills & Competencies:
- Proven experience as a Big Data Architect
- Strong background in Big Data infrastructure, engineering, and development, and in working with Big Data and the Hadoop File System
- Hands-on experience with the Hadoop ecosystem (HDFS, Sqoop, Hive, Pig, Spark, Scala)
- Successful background as an architect on EDW/Data Lake projects preferred
- Understanding of strategic IT solutions
- Experience building cloud-native, container-based solutions

If you are interested in this position, please share your updated Word resume with the following details:
Full Name:
Total Big Data Experience:
Total Data Architect Experience:
Current CTC:
Expected CTC:
Notice Period:
Current Location:
About the Company
Leap Info Systems Pvt. Ltd. is a software product company with products and solutions in convergent lighting controls and automation. Leap recently acquired elitedali, the world's first Niagara-based lighting controls and automation solution, from one of the leading US organizations. We are a passionate team on a mission to develop innovative controls and automation products. We are expanding our product development team and are in search of like-minded, highly passionate team members who would like to contribute to the leading lighting controls and automation product. We have customers in India, Europe, the USA, and Australia.

Eligibility
Any suitable graduates (0 to 2 years experienced) who meet the prescribed qualities and job responsibilities. Preferred, but not limited to, engineering graduates in Computer/IT/Instrumentation/EnTC etc.

Job Responsibilities
Be a part of the product development team, maintaining existing products as well as developing new products based on Java and the Niagara Software framework.

Qualities
- Self-learner
- Analytical skills
- Able to work with a cross-functional team
- Problem-solving approach
- Good team player with positive vibes

What you must have
- Hands-on programming at the academic or hobby level
- Good knowledge of Java/J2EE-related technologies
- Good knowledge of solution-based approaches
- Good communication skills - phone, email, and in-person

What you can expect from LEAP
- A positive, conducive working environment in which to grow with cross-functional team members
- Flexible working hours with defined responsibilities
- Opportunity to work with a leading open-standard automation framework like Niagara Software, and lighting controls technologies like DALI, wireless, etc.
- Being an active part of an emerging global product company
Responsibilities:
- Work with developers to design algorithms and flowcharts
- Prepare GUI dummy screens for proposed software development using Excel VBA (to give an overview of how the software buttons and the flow of information should work)
- Coordinate with the software developer team to explain the criteria
- Produce clean, efficient code based on specifications
- Integrate software components and third-party programs
- Verify and deploy programs and systems
- Troubleshoot, debug, and upgrade existing software
- Gather and evaluate user feedback
- Recommend and execute improvements
- Create technical documentation for reference and reporting

Requirements:
- Proven experience as a Software Developer, Software Engineer, or similar role
- Familiarity with development methodologies
- Experience with software design and development in a test-driven environment
- Knowledge of coding languages (e.g. C#, C++) and frameworks/systems
- Ability to learn new languages and technologies
- Excellent communication skills
- Resourcefulness and troubleshooting aptitude
- Attention to detail
- Sound technical knowledge; thorough knowledge of all related codes and section details is desired
- Thorough knowledge of the design of components of residential/commercial structures is desired
- Accuracy in following the process and jobs is required
- Experience interacting with international clients will be preferred
Description
- Deep experience with and understanding of Apache Hadoop and surrounding technologies required; experience with Spark, Impala, Hive, Flume, Parquet, and MapReduce
- Strong understanding of development languages including Java, Python, Scala, and shell scripting
- Expertise in Apache Spark 2.x framework principles and usage
- Proficient in developing Spark batch and streaming jobs in Python, Scala, or Java
- Proven experience in performance tuning of Spark applications, from both the application-code and configuration perspectives
- Proficient in Kafka and its integration with Spark
- Proficient in Spark SQL and data warehousing techniques using Hive
- Very proficient in Unix shell scripting and operating on Linux
- Knowledge of any cloud-based infrastructure
- Strong understanding of data profiling concepts and the ability to operationalize analyses into design and development activities
- Experience with software development best practices: version control systems, automated builds, etc.
- Experienced in, and able to lead, the phases of the Software Development Life Cycle on any project (feasibility planning, analysis, development, integration, test, and implementation)
- Capable of working within a team or as an individual
- Experience creating technical documentation
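For readers unfamiliar with the MapReduce model named in the description, it can be sketched in plain Python (a hypothetical toy, not an actual Hadoop or Spark job - the real frameworks distribute the map, shuffle, and reduce phases across a cluster):

```python
from collections import defaultdict


def map_phase(lines):
    """Map: emit a (word, 1) pair for every word in every input line."""
    for line in lines:
        for word in line.split():
            yield (word.lower(), 1)


def reduce_phase(pairs):
    """Shuffle + reduce: group the emitted pairs by key and sum the counts."""
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)


# Toy input standing in for lines of a distributed file
lines = ["spark streams data", "spark batches data"]
result = reduce_phase(map_phase(lines))
```

The same word-count shape appears as a few lines of `flatMap`/`reduceByKey` in Spark, which is what makes it the standard first exercise for the batch jobs mentioned above.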
We at InfoVision Labs are passionate about technology and what our clients would like to accomplish. We continuously strive to understand business challenges, the changing competitive landscape, and how cutting-edge technology can position our clients at the forefront of the competition. We are a fun-loving team of usability experts and software engineers focused on mobile technology, responsive web solutions, and cloud-based solutions.

Job Responsibilities:
◾ Minimum 3 years of experience in Big Data skills required
◾ Complete life-cycle experience with Big Data is highly preferred
◾ Skills: Hadoop, Spark, R, Hive, Pig, HBase, and Scala
◾ Excellent communication skills
◾ Ability to work independently with no supervision
Crest (part of the Springer Nature group): Headquartered in Pune, Crest is a Springer Nature company that delivers cutting-edge IT and ITeS solutions to some of the biggest scientific content and database brands in the world. Our global teams work closely with our counterparts and clients in Europe, the USA, and New Zealand, leveraging the latest technology, marketing intelligence, and subject matter expertise. With handpicked SMEs in a range of sciences, and technology teams working on the latest ECM, Scala, SAP, and MS Tech platforms, Crest not only develops quality STM content but continuously enhances the channels through which it is delivered to the world. Crest is ISO 9001 certified and driven by over 1000 professionals in Technology, Research & Analysis, and Marketing & BPM.

Specialties:
1. Technology
2. Research
3. Marketing Intelligence
4. Business Process Management