
Summary
Our Kafka developer combines technical skills, communication skills, and business knowledge. The developer should be able to work on multiple medium-to-large projects. The successful candidate will have excellent technical skills in Apache/Confluent Kafka and enterprise data warehousing, preferably GCP BigQuery or an equivalent cloud EDW, and will be able to take oral and written business requirements and develop efficient code to meet the set deliverables.
Must Have Skills
- Participate in the development, enhancement and maintenance of data applications both as an individual contributor and as a lead.
- Lead the identification, isolation, resolution, and communication of problems within the production environment.
- Act as lead developer, applying technical skills in Apache/Confluent Kafka (preferred) or AWS Kinesis (optional), and in a cloud enterprise data warehouse: Google BigQuery (preferred), AWS Redshift, or Snowflake (optional).
- Design and recommend the best approach for data movement from different sources to the cloud EDW using Apache/Confluent Kafka.
- Perform independent functional and technical analysis for major projects supporting several corporate initiatives.
- Communicate and work with IT partners and the user community at all levels, from senior management to developers to business SMEs, for project definition.
- Work on multiple platforms and multiple projects concurrently.
- Perform coding and unit testing for complex-scope modules and projects.
- Provide expertise and hands-on experience with Kafka Connect and Schema Registry in a very high-volume environment (~900 million messages).
- Provide expertise in Kafka brokers, ZooKeeper, KSQL, Kafka Streams (KStreams), and Confluent Control Center.
- Provide expertise and hands-on experience with AvroConverter, JsonConverter, and StringConverter.
- Provide expertise and hands-on experience with Kafka connectors such as MQ, Elasticsearch, JDBC, FileStream, and JMS source connectors, as well as tasks, workers, converters, and transforms.
- Provide expertise and hands-on experience building custom connectors using Kafka core concepts and the Connect API.
- Working knowledge of the Kafka REST Proxy.
- Ensure optimum performance, high availability, and stability of solutions.
- Create topics, set up cluster redundancy, deploy monitoring tools and alerts, and apply best practices.
- Create stubs for producers, consumers, and consumer groups to help onboard applications from different languages/platforms (a minimal stub sketch follows this list).
- Leverage Hadoop ecosystem knowledge to design and develop capabilities that deliver our solutions using Spark, Scala, Python, Hive, Kafka, and other Hadoop ecosystem tools.
- Use automation and provisioning tools such as Jenkins, UrbanCode Deploy (uDeploy), or similar technologies.
- Ability to perform data-related benchmarking, performance analysis, and tuning.
- Strong skills in in-memory applications, database design, and data integration.
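
For illustration only: a minimal producer/consumer stub of the kind described above, written with the confluent-kafka Python client. The broker address, topic name, and consumer group id are assumptions made for the sketch, not requirements of this posting.

# Sketch: minimal Kafka producer/consumer stubs (confluent-kafka Python client).
# Broker address, topic, and group id below are placeholder assumptions.
from confluent_kafka import Producer, Consumer

BROKERS = "localhost:9092"   # assumed broker address
TOPIC = "example-topic"      # hypothetical topic name

def produce_one(payload: bytes) -> None:
    """Send a single message and wait for delivery confirmation."""
    producer = Producer({"bootstrap.servers": BROKERS})
    producer.produce(TOPIC, value=payload)
    producer.flush()

def consume_forever(group_id: str = "onboarding-demo") -> None:
    """Poll the topic as part of a consumer group and print each message."""
    consumer = Consumer({
        "bootstrap.servers": BROKERS,
        "group.id": group_id,
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe([TOPIC])
    try:
        while True:
            msg = consumer.poll(1.0)
            if msg is None:
                continue
            if msg.error():
                print("consumer error:", msg.error())
                continue
            print(msg.value())
    finally:
        consumer.close()

if __name__ == "__main__":
    produce_one(b"hello, kafka")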

Required Skills:
· 8+ years as a practitioner in data engineering or a related field.
· Proficiency in Python programming.
· Experience with data processing frameworks like Apache Spark or Hadoop.
· Experience working on Databricks.
· Familiarity with cloud platforms (AWS, Azure) and their data services.
· Experience with data warehousing concepts and technologies.
· Experience with message queues and streaming platforms (e.g., Kafka).
· Excellent communication and collaboration skills.
· Ability to work independently and as part of a geographically distributed team.
Responsibilities:
- Designing and implementing fine-tuned, production-ready data/ML pipelines on the Hadoop platform.
- Driving optimization, testing and tooling to improve quality.
- Reviewing and approving high-level and detailed designs to ensure that the solution meets the business needs and aligns with the data & analytics architecture principles and roadmap.
- Understanding business requirements and solution design to develop and implement solutions that adhere to big data architectural guidelines and address business requirements.
- Following proper SDLC (Code review, sprint process).
- Identifying, designing, and implementing internal process improvements: automating manual processes, optimizing data delivery, etc.
- Building robust and scalable data infrastructure (both batch processing and real-time) to support needs from internal and external users.
- Understanding various data security standards and using data security tools to apply and adhere to the required data controls for user access on the Hadoop platform.
- Supporting and contributing to development guidelines and standards for data ingestion.
- Working with the data science and business analytics teams to assist with data ingestion and data-related technical issues.
- Designing and documenting the development & deployment flow.
Requirements:
- Experience in developing REST API services using one of the Scala frameworks.
- Ability to troubleshoot and optimize complex queries on the Spark platform (see the sketch after this list).
- Expert in building and optimizing ‘big data’ data/ML pipelines, architectures, and data sets.
- Knowledge of modelling unstructured data into structured data designs.
- Experience in Big Data access and storage techniques.
- Experience in doing cost estimation based on the design and development.
- Excellent debugging skills across the technical stack mentioned above, including analysis of server and application logs.
- Highly organized, self-motivated, proactive, and able to propose the best design solutions.
- Good time management and multitasking skills to meet deadlines, working independently and as part of a team.
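
As a minimal illustration of the Spark query troubleshooting and optimization called out above, here is a PySpark sketch; the input paths, column names, and the broadcast-join choice are assumptions made for the example.

# Sketch: inspect and tune a slow Spark aggregation (paths and columns are hypothetical).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("query-tuning-sketch").getOrCreate()

orders = spark.read.parquet("/data/orders")          # assumed input path
customers = spark.read.parquet("/data/customers")    # assumed input path

# Broadcast the small dimension table to avoid a shuffle-heavy join.
joined = orders.join(F.broadcast(customers), on="customer_id", how="inner")

daily_revenue = (
    joined.groupBy("order_date")
          .agg(F.sum("amount").alias("revenue"))
)

daily_revenue.explain(True)   # review the physical plan while troubleshooting
daily_revenue.write.mode("overwrite").parquet("/data/output/daily_revenue")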
Web Designer
Are you a creative mind with a passion for web design? Join our team at Bitcoding Solutions!
🔹 Role: Web Designer
🔹 Experience: 1.5 to 2 years
🔹 Skills:
- Proficiency in HTML, CSS, JavaScript
- Knowledge of jQuery
🔹 Responsibilities:
- Designing visually appealing and user-friendly websites
- Collaborating with cross-functional teams to implement design solutions
- Ensuring compatibility across different browsers and devices
- Staying updated on emerging web design trends
🔹 Location: Mota Varacha
If you're ready to showcase your skills and contribute to exciting projects, apply now!
ROLES AND RESPONSIBILITIES
● Understand the business and the potential to promote the business and brand.
● Analyzing market and sales information
● Identification of prospective clients.
● Preparing a go-to-market strategy across digital, social media, and print media
● Set up a Brand identification strategy and execution plan
● Research competitive products by identifying and evaluating product characteristics, market share, and pricing, and by maintaining research databases.
● Track product line sales and costs by analyzing data and reviewing the competitors' reach.
● Support sales staff with pre-sales tasks by providing sales data, market trends, forecasts, account analysis, and new product information, and by relaying customer service requests.
● Follow up with Clients through the CEO's contacts
● Website maintenance and press releases
● Marketing collateral – creation and maintenance and presentations
● E-campaign coordination
● Setting up tracking systems for marketing campaigns and online activities
● Reporting to management on marketing and sales activities
Must Haves:
● A marketing professional with a BBM/MBA in marketing.
● 4+ years of relevant experience.
● Academics from the city, with excellent written and spoken communication skills.
● A dynamic professional who is very articulate, driven, and creative.
● Someone who understands the fundamentals of marketing (the 4P’s) as a foundation.
● The incumbent needs to be an expert in digital marketing that includes
a. Designing and conceptualizing visual aids – LinkedIn banners, videos, etc
b. Written skills to create thought leadership posts, blogs, etc
c. Content management to work around the products on digital mediums
d. An expert in LinkedIn and other digital tools
e. Google and derivatives
f. Website and traffic generation
g. Market research and campaigns
h. Ability to work independently as well as in a team environment
● Implement web or mobile interfaces using XHTML, CSS, and JavaScript
● Analyze and optimize UI and infrastructure application code for quality, efficiency, and performance
● Design & build the backend API servers that talk to the data infrastructure systems to fetch the data exposed via the Arcana UI (a minimal sketch follows this list).
● Track data quality and latency, and set up monitors and alerts to ensure smooth operation
● Analyze and improve efficiency, scalability, and stability of various system resources
● Effectively communicate complex features and systems in detail.
● Establish self as an owner of a large scope component, feature or system with expert end-to-end understanding.
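
By way of illustration, a minimal sketch of one such backend API endpoint in Python (Flask), returning data for a UI and recording request latency; the framework choice, data store, and table name are assumptions, not part of this description.

# Sketch: a small backend API endpoint that fetches data for a UI and tracks latency.
# Flask and the local SQLite table below are illustrative assumptions.
import sqlite3
import time
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/api/metrics/<name>")
def get_metric(name):
    started = time.monotonic()
    conn = sqlite3.connect("metrics.db")              # assumed local data store
    rows = conn.execute(
        "SELECT ts, value FROM metrics WHERE name = ? ORDER BY ts DESC LIMIT 100",
        (name,),
    ).fetchall()
    conn.close()
    latency_ms = (time.monotonic() - started) * 1000  # could feed a monitor/alert
    return jsonify({"metric": name, "points": rows, "latency_ms": latency_ms})

if __name__ == "__main__":
    app.run(port=8080)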
Position: Big Data Engineer
What You'll Do
Punchh is seeking to hire a Big Data Engineer at either a senior or tech lead level. Reporting to the Director of Big Data, this engineer will play a critical role in leading Punchh’s big data innovations. By leveraging prior industry experience in big data, they will help create cutting-edge data and analytics products for Punchh’s business partners.
This role requires close collaboration with the data, engineering, and product organizations. Job functions include:
- Work with large data sets and implement sophisticated data pipelines with both structured and unstructured data.
- Collaborate with stakeholders to design scalable solutions.
- Manage and optimize our internal data pipeline that supports marketing, customer success and data science to name a few.
- Serve as a technical leader of Punchh’s big data platform that supports AI and BI products.
- Work with infra and operations team to monitor and optimize existing infrastructure
- Occasional business travel is required.
What You'll Need
- 5+ years of experience as a Big Data engineering professional, developing scalable big data solutions.
- Advanced degree in computer science, engineering or other related fields.
- Demonstrated strength in data modeling, data warehousing and SQL.
- Extensive knowledge of cloud technologies, e.g., AWS and Azure.
- Excellent software engineering background with high familiarity with the software development life cycle; familiarity with GitHub/Airflow (a minimal DAG sketch follows this list).
- Advanced knowledge of big data technologies, such as programming languages (Python, Java), relational databases (Postgres, MySQL), NoSQL (MongoDB), Hadoop (EMR), and streaming (Kafka, Spark).
- Strong problem solving skills with demonstrated rigor in building and maintaining a complex data pipeline.
- Exceptional communication skills and ability to articulate a complex concept with thoughtful, actionable recommendations.
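
For context on the Airflow familiarity mentioned above, here is a minimal DAG sketch in Python; the DAG id, schedule, and task bodies are placeholder assumptions (it uses the Airflow 2.4+ style schedule argument).

# Sketch: a daily batch pipeline as an Airflow DAG (dag id, schedule, and task
# bodies are hypothetical; uses the Airflow 2.4+ "schedule" argument).
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw data from the source system")

def transform():
    print("clean and aggregate the raw data")

def load():
    print("load the curated data into the warehouse")

with DAG(
    dag_id="daily_batch_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_transform >> t_load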
This is regarding a job opening at Peritus Infotech Solutions Pvt. Ltd., Noida.
Key Skills Required: PHP, Laravel, CodeIgniter, MySQL, jQuery, API (RESTful services), HTML, CSS.
Job Description:
1. Design, Develop, Implement, Test, Document and Maintain high quality Web Application Software.
2. Managing and delivery of projects from conceptualization, visualization to technology mapping and final execution of projects.
3. Understanding and implementing of project management practices.
4. MVC-compliant development in at least one of these frameworks (CodeIgniter / PHP / Laravel).
5. Design simple and intuitive user interfaces, creating examples through wireframes and mockups.
6. Optimisation of PHP Code and database queries and functions through Code Review and Project Review Sessions.
7. Identify opportunities for process improvement and make constructive suggestions for change.
8. Improve the technical competence of the team through training & development of existing and new team members.
9. Provide accurate effort estimates for deliverables and stay committed to the deadlines through follow-up of tasks with team members.
10. Research and actively learn about new technologies and introduce them into the infrastructure.
11. Ability to handle technical queries raised by the development team and provide support and guidance to them.
Eligibility criteria :
1. Advanced PHP experience, specifically with PHP 5.
2. Strong database skills, proven experience with MySQL 5 and other RDBMS having knowledge of indexes, full-text searches, usage of Regular Expressions in queries and more.
3. Strong knowledge of OOPs concepts and OOAD using Design Patterns.
4. Excellent knowledge of Applications Architecture and how to work with Frameworks with MVC architecture.
5. Excellent knowledge of HTML5, CSS3, JavaScript, jQuery/Sencha, etc.
6. Working Knowledge of Code Repository and Version Control Systems like SVN, GIT, CVS, Vault, Starteam, TFS, VSS or any other.
7. Strong Knowledge of any one of the Project Management tools like Basecamp, Redmine, Confluence etc.
8. Excellent problem-solving skills and love technical challenges.
9. An ideal candidate should have experience creating highly trafficked websites.
10. Ability to motivate staff in a team-oriented, collaborative environment.
11. Ability to manage priorities and work in a fast-paced environment.
12. Excellent Communication skills.
UG: B.Sc in Any Specialization, B.Tech/B.E. in Any Specialization, BCA in Computers
PG: Any Postgraduate
Doctorate: Not Required
Experience - 2.0 - 6.0 Year(s)
Location : Bangalore/Gurgaon
Qualification : Any Graduation
Candidate should either be from a Tier 1 college or working in a Tier 1 product-based company.
Looking for candidates with 2 to 6 years of experience in frontend development, including at least 1.5 to 2 years of experience in React.js.
Individual contributor role
Reporting to Tech Lead Directly


















