Company: PluginLive
About the company:
PluginLive Technology Pvt Ltd is a leading provider of innovative HR solutions. Our mission is to transform the hiring process through technology and make it easier for organizations to find, attract, and hire top talent. We are looking for a passionate and experienced Data Engineering Lead to guide the data strategy and engineering efforts for our Campus Hiring Digital Recruitment SaaS Platform.
Role Overview:
The Data Engineering Lead will be responsible for leading the data engineering team and driving the development of data infrastructure, pipelines, and analytics capabilities for our Campus Hiring Digital Recruitment SaaS Platform. This role requires a deep understanding of data engineering, big data technologies, and team leadership. The ideal candidate will have a strong technical background, excellent leadership skills, and a proven track record of building robust data systems.
Job Description
Position: Data Engineering Lead - Campus Hiring Digital Recruitment SaaS Platform
Location: Chennai
Minimum Qualification: Bachelor’s degree in Computer Science, Engineering, Data Science, or a related field. A Master’s degree or equivalent is a plus.
Experience: 7+ years of experience in data engineering, with at least 3 years in a leadership role.
CTC: 20-30 LPA
Employment Type: Full Time
Key Responsibilities:
Data Strategy and Vision:
- Develop and communicate a clear data strategy and vision for the Campus Hiring Digital Recruitment SaaS Platform.
- Conduct market research and competitive analysis to identify trends, opportunities, and data needs.
- Define and prioritize the data roadmap, aligning it with business goals and customer requirements.
Data Infrastructure Development:
- Design, build, and maintain scalable data infrastructure and pipelines to support data collection, storage, processing, and analysis.
- Ensure the reliability, scalability, and performance of the data infrastructure.
- Implement best practices in data management, including data governance, data quality, and data security.
Data Pipeline Management:
- Oversee the development and maintenance of ETL (Extract, Transform, Load) processes.
- Ensure data is accurately and efficiently processed and available for analytics and reporting.
- Monitor and optimize data pipelines for performance and cost efficiency.
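The ETL responsibilities above can be sketched end to end. The following minimal, standard-library Python example is illustrative only: the data source, field names, and score cutoff are hypothetical, not part of the platform.

```python
import csv
import io

# Hypothetical raw feed of campus-hiring candidates (stand-in for a real source).
RAW = """candidate_id,college,score
c1,IIT Madras,82
c2,Anna University,67
c3,VIT,91
"""

def extract(raw):
    """Extract: parse raw CSV rows into dictionaries."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Transform: cast types and keep candidates at or above a cutoff."""
    return [
        {"candidate_id": r["candidate_id"], "score": int(r["score"])}
        for r in rows
        if int(r["score"]) >= 70
    ]

def load(rows, store):
    """Load: write into an in-memory dict (stand-in for a warehouse table)."""
    for r in rows:
        store[r["candidate_id"]] = r["score"]

store = {}
load(transform(extract(RAW)), store)
print(store)  # {'c1': 82, 'c3': 91}
```

In a production pipeline each stage would typically be a task in an orchestrator such as Airflow, so that runs can be scheduled, monitored, and retried independently.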
Data Analytics and Reporting:
- Collaborate with data analysts and data scientists to build and deploy advanced analytics and machine learning models.
- Develop and maintain data models, dashboards, and reports to provide insights and support decision-making.
- Ensure data is easily accessible and usable by stakeholders across the organization.
Team Leadership:
- Lead, mentor, and guide a team of data engineers, fostering a culture of collaboration, continuous improvement, and innovation.
- Conduct code reviews, provide constructive feedback, and ensure adherence to development standards.
- Collaborate with cross-functional teams including product, engineering, and marketing to ensure alignment and delivery of data goals.
Stakeholder Collaboration:
- Work closely with stakeholders to understand business requirements and translate them into technical specifications.
- Communicate effectively with non-technical stakeholders to explain data concepts and progress.
- Participate in strategic planning and decision-making processes.
Skills Required:
- Proven experience in designing and building scalable data infrastructures and pipelines.
- Strong proficiency in programming languages such as Python and R, and in data visualization tools such as Power BI, Tableau, Qlik, and Google Analytics.
- Expertise in big data technologies such as Apache Airflow, Hadoop, Spark, and Kafka, and cloud data platforms such as AWS and Oracle Cloud.
- Solid understanding of database technologies, both SQL and NoSQL.
- Experience with data modeling, data warehousing, and ETL processes.
- Strong analytical and problem-solving abilities.
- Excellent communication, collaboration, and leadership skills.
Preferred Qualifications:
- Experience in HR technology or recruitment platforms.
- Familiarity with machine learning and AI technologies.
- Knowledge of data governance and data security best practices.
- Contributions to open-source projects or active participation in the tech community.
Requirements:
- Strong expertise in Kubernetes and ELK (Elasticsearch, Logstash, Kibana)
- Experience in implementation of, or migration to, Elastic on Kubernetes (ECK)
- Experience working in large enterprises and with various stakeholders across DataCenter and Network teams
Responsibilities
- Set up the ECK cluster and upgrade the ECK operator
- Migrate existing projects (100+) from ELK to ECK
- Build automation for moving projects from ELK to ECK
- Perform regression testing and close any open technical items (TCM)
- Define processes for landing new business
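Since migrating 100+ projects calls for automation, a script along the following lines could generate one ECK Elasticsearch manifest per project. This is a hedged sketch: the namespace-per-project scheme, node counts, and Elasticsearch version are assumptions for illustration, not the team's actual setup.

```python
# Template for an ECK-managed Elasticsearch cluster (Elasticsearch CRD).
TEMPLATE = """apiVersion: elasticsearch.k8s.elastic.co/v1
kind: Elasticsearch
metadata:
  name: {name}
  namespace: {namespace}
spec:
  version: 8.13.0
  nodeSets:
  - name: default
    count: {nodes}
    config:
      node.store.allow_mmap: false
"""

def render_manifest(project, nodes=3):
    # One namespace per project keeps the migrated clusters isolated;
    # the manifest can then be applied with kubectl or a CI job.
    return TEMPLATE.format(name=f"{project}-es", namespace=project, nodes=nodes)

# Hypothetical project name, purely for demonstration.
manifest = render_manifest("billing", nodes=2)
print(manifest)
```

Looping this over the project inventory, then applying and regression-testing each manifest, is the kind of automation the responsibilities above describe.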
- Object-oriented JavaScript
- jQuery
- ES6
- React JS (versions 16.8 & below)
- Redux (action types, action creators & reducers)
- Middleware (Redux-Thunk, Redux-Saga)
- React Hooks
- Axios & Fetch libraries
- Managing state & props efficiently
- React forms
- Validation (schema & non-schema, using any third-party library or custom code)
- Awareness of the production build process
- HTML5
- Knowledge of CSS
- REST API integration knowledge
- Authentication (token-based) & authorization with routes
- Building, Designing, Testing, and Deploying web apps.
- Thoroughly explore & understand what is under the hood of the products in the market, and the strengths & fault lines of each.
- Drive the analytics suite of products from conceptualisation to deployment, building products that are robust and can stand the test of time as our data size multiplies month on month.
- Work closely with designers & product managers to build things that are normally regarded as holy grails in browser-based analytics applications.
- Actively participate in design and code reviews to build robust applications and prototypes.
- Interact with team members and customers to continuously look for better innovations for all teams across Trade Brains
Summary: - As a Product Manager, you will act as a conduit between surgeons and our R&D team. You’ll work with surgeons to understand their needs & desires, and you’ll have the technical skills to work with engineers to deliver those requirements. The result of your work is a commercial product that enterprise users (hospitals and surgeons) love and value. You’ll explore & develop product ideas through scheduled conversations with customers, assess competitors’ offerings, and investigate open-source technologies. You will collaborate with the UX, engineering, and leadership teams to assess and prioritize what is best for product development. Your key role will be translating user requirements into executable work items that the development team can implement in the product. You will have the detailed technical knowledge necessary to explain feature requirements to developers and to work with them to deliver these improvements for users. This role is suitable for someone who enjoys working with users and technology, and delivering software products that end customers would love to use in their respective fields of work.
Core Competencies: -
▪ Excellent Communication – Oral and Written
▪ Collaborative Mindset ▪ Adaptability to dynamic business scenarios
▪ Analytical Thinking
▪ Trusted Partnership
▪ Execution Excellence
▪ Problem-Solving
▪ Networking and Relationship Building
Job Responsibilities: -
# Product Exploration:-
▪ Understanding the overall market and competitors
▪ Work closely with stakeholders (customers, business development, and the open-source community) to understand needs and priorities
▪ Translate market and stakeholder needs into product requirements and definitions
▪ Collaborate with UX and Engineering on feature ideas and prototypes to bring potential solutions to life.
# Product Development:-
▪ Collaborate with stakeholders to guide feature development and prioritization
▪ Translate user requirements into executable feature work items with engineering.
▪ Be the "voice of the customer" during the specification and development stages.
▪ Influence and collaborate in open source while balancing commercial product development.
# Product Success:-
▪ Communicate the business value of product capabilities to sales.
▪ Act as a product champion externally and internally in support of sales channels.
▪ Develop, track, and report key metrics to drive future product development.
Skills and Experience Requirements:-
▪ Any Engineering Degree/MBA.
▪ Aptitude for technology and familiarity with basic SDLC.
▪ Excellent communication skills, both verbal and written.
▪ 4 - 8 years of software product management experience.
▪ Enthusiastic about new technology and its transformational abilities.
▪ Prior experience in the field of Ortho / Navigation is preferable.
▪ Exposure to new product launches.
Job Title: Lead – Product Engineering
About SalesPanda, a unit of Bizight Solutions Private Limited
SalesPanda is a sales & marketing automation platform that helps companies grow their business directly as well as through their channel partners. With features like content marketing, social media marketing, email marketing, lead management, and website activity tracking, it acts as an integrated platform for managing sales & marketing efforts. We work predominantly with insurance and financial services companies to automate digital marketing for their channel partners.
SalesPanda is a product of Bizight Solutions, a reputed B2B digital agency with decades of sales and marketing experience. The company has a decade-long track record and works with premium clients like IBM, AWS, HDFC, Aditya Birla, MaxLife, ICICI Home Finance, Ingram, Redington, and many more.
Experience Required: 8-12 years
Education: Full-time Graduation in relevant IT stream (BE in CS/B.Tech CS) is mandatory.
Job Profile
We are looking for young and energetic technical folks who are willing to take up the challenge of leading the technical roadmap and the architecture of a fast-growing SaaS product.
The current role is “Lead – Product Engineering” for SalesPanda, and it requires the candidate to technically lead both the product’s architecture and infrastructure and the application itself. The infrastructure is hosted on AWS and the application is built on the LAMP stack. The candidate needs a good understanding of, and preferably hands-on experience with:
- Design and development of LAMP-based SaaS and Mobile product
- Software development lifecycle, including source code control, issue tracking, version management, and unit and integration testing
- Understanding of AWS Cloud deployment and technologies like Lambda etc
- Performance tuning of database and application code
- Queuing solutions like AWS SQS, RabbitMQ etc
- Web/Mobile Security and Vulnerability Testing
- Third-party API integration
- Enhancing code coverage and quality through automated frameworks and tools.
- GIT and Jira processes
Experience in working with BFSI clients will be an added advantage.
Roles and Responsibilities
The candidate is expected to perform the following
- Review and optimize product architecture
- Optimize the product’s scalability and performance – this involves working with the development and testing teams and implementing innovative ways to scale.
- Improve Development/ QA processes
- Plan best practices, and work hands-on on new features to check viability before handing them over to development teams – e.g. APIs for email integration, CRM integration, and social media
- Study AWS infrastructure setup and plan optimization
- Review application security and plan/implement the roadmap with the security team.
Skills Required:
- Must have hands-on experience with core PHP, the Laravel framework, and MySQL
- Experience in building cloud (AWS) solutions
- Must have worked with distributed systems and understand caching, queue management, and message brokering.
- Should have good knowledge of stateless applications and have worked with auto-scaled environments on the cloud
- Should understand serverless concepts e.g. AWS Lambda etc.
- Must be a hands-on techie who can lead the team from the front.
- Implement new tools and technologies to further improve product stability
- Understanding of Integration of Email, Instagram, WhatsApp, Facebook, and other APIs
- Familiarity with security technologies, processes, and concepts such as Authentication and Authorization, Static Code Analysis, penetration testing methods
- Experience with common application security issues and remediation
- An understanding of performance engineering, especially SQL tuning
- Unit and integration testing frameworks and tools
- Experience in using source code control tools like Git
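To illustrate the stateless and serverless concepts in the list above, here is a minimal Lambda-style handler that drains a queued batch of messages. It is a sketch only: the event shape mimics an SQS batch, and the `lead_id` field is a hypothetical name, not part of SalesPanda.

```python
import json

def handler(event, context=None):
    """Process each queued message independently, keeping no local state,
    so the function can scale out horizontally in an auto-scaled environment."""
    processed = []
    for record in event.get("Records", []):
        body = json.loads(record["body"])  # each record carries a JSON payload
        processed.append(body["lead_id"])
    return {"statusCode": 200, "processed": processed}

# Simulated queue batch (two messages), for local testing without AWS.
event = {"Records": [{"body": json.dumps({"lead_id": "L-1"})},
                     {"body": json.dumps({"lead_id": "L-2"})}]}
print(handler(event))  # {'statusCode': 200, 'processed': ['L-1', 'L-2']}
```

Because the handler holds no state between invocations, the same code works whether one instance runs or a hundred, which is the property the "stateless applications" requirement is after.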
Employment Type: Full Time, 5 days a week
Compensation : As per industry standards
Job Location: Nehru Place, New Delhi
Visit www.bizight.com and www.salespanda.com to know more about us.
- Developing telemetry software to connect Junos devices to the cloud
- Fast prototyping and laying the SW foundation for product solutions
- Moving prototype solutions to a production cloud multitenant SaaS solution
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources
- Build analytics tools that utilize the data pipeline to provide significant insights into customer acquisition, operational efficiency and other key business performance metrics.
- Work with partners including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics specialists to strive for greater functionality in our data systems.
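As a sketch of the analytics-tooling responsibility above, the snippet below computes a simple customer-acquisition metric from pipeline output. The records, channel names, and fields are hypothetical, chosen only to make the example self-contained.

```python
from collections import Counter

# Hypothetical events emitted by the data pipeline.
events = [
    {"user": "u1", "channel": "campus_fair", "signed_up": True},
    {"user": "u2", "channel": "referral",    "signed_up": False},
    {"user": "u3", "channel": "campus_fair", "signed_up": True},
]

def acquisitions_by_channel(rows):
    """Count successful sign-ups per acquisition channel."""
    return Counter(r["channel"] for r in rows if r["signed_up"])

print(acquisitions_by_channel(events))  # Counter({'campus_fair': 2})
```

In practice this aggregation would run over the warehouse rather than an in-memory list, but the shape of the computation, filter then group-and-count, is the same.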
Qualification and Desired Experiences
- Master’s degree in Computer Science, Electrical Engineering, Statistics, Applied Math, or an equivalent field, with a strong mathematical background
- 5+ years of experience building data pipelines for data-science-driven solutions
- Strong hands-on coding skills (preferably in Python), processing large-scale data sets and developing machine learning models
- Familiarity with one or more machine learning or statistical modeling tools such as NumPy, scikit-learn, MLlib, and TensorFlow
- Good team player with excellent interpersonal, written, verbal, and presentation skills
- Create and maintain optimal data pipeline architecture,
- Assemble large, sophisticated data sets that meet functional / non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Experience with AWS, S3, Flink, Spark, Kafka, Elastic Search
- Previous work in a start-up environment
- 3+ years of experience building data pipelines for data-science-driven solutions
- We are looking for a candidate with 9+ years of experience in a Data Engineer role who holds a graduate degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field. They should also have experience using the following software/tools:
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- Experience with AWS cloud services: EC2, EMR, RDS, Redshift
- Experience with stream-processing systems: Storm, Spark-Streaming, etc.
- Experience with object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
- Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL) as well as working familiarity with a variety of databases.
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and find opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Build processes supporting data transformation, data structures, metadata, dependency and workload management.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Proven understanding of message queuing, stream processing, and highly scalable ‘big data’ data stores.
- Strong project management and interpersonal skills.
- Experience supporting and working with multi-functional teams in a multidimensional environment.
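The "advanced working SQL knowledge" and query-authoring requirements above can be illustrated with a small, self-contained example, using SQLite as a stand-in for a production relational database; the schema and data are hypothetical.

```python
import sqlite3

# In-memory database so the example runs anywhere with no setup.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, event TEXT, amount REAL)")
conn.executemany("INSERT INTO events VALUES (?, ?, ?)", [
    ("u1", "purchase", 10.0),
    ("u1", "purchase", 5.0),
    ("u2", "refund",  -3.0),
])

# Aggregate per user: the bread-and-butter shape of analytics SQL.
rows = conn.execute(
    "SELECT user_id, SUM(amount) FROM events GROUP BY user_id ORDER BY user_id"
).fetchall()
print(rows)  # [('u1', 15.0), ('u2', -3.0)]
```

The same GROUP BY pattern scales from SQLite to Postgres or Redshift; what changes at warehouse scale is indexing, partitioning, and query-plan tuning rather than the SQL itself.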
Job Description
- Minimum 5 years of development experience with any full-stack framework (except Java-based)
- Experience in designing effective data structures and algorithms
- Experience with SQL and NoSQL databases and designs
- In-depth understanding of JavaScript concepts and one front-end framework
- Experience with version control and project management tools
- Experience in developing and consuming REST web services
- Excellent oral and written communication (English + 1 local language)
- Analytical and problem-solving skills
- Familiarity with any cloud service provider - AWS / DigitalOcean / Azure
Roles and Responsibilities
- requirement gathering and proving
- feasibility analysis
- estimations & planning
- prepare and maintain relevant documentation
- coding, debugging, testing
- benchmarking and performance upkeep
- deployment & maintenance
Signs you would be a good fit
- you enjoy crafting elegant, well-tested solutions, not just delivering working code
- you have logical and well-researched opinions on existing and new technologies, and relish the learnings and challenges of working on different platforms and products
- you have an entrepreneurial mindset
- you are a quick learner with proven reasoning skills
- you explore beyond your resume
About Eloquent Studio Pvt Ltd
Team of 1-10 people
Located in Pune, Maharashtra, India
Years of Exp: 3-6+ Years
Skills: Scala, Python, Hive, Airflow, Spark
Languages: Java, Python, Shell Scripting
GCP: BigTable, DataProc, BigQuery, GCS, Pubsub
OR
AWS: Athena, Glue, EMR, S3, Redshift
MongoDB, MySQL, Kafka
Platforms: Cloudera / Hortonworks
AdTech domain experience is a plus.
Job Type - Full Time
We are changing the way people learn professional skills and are looking for people who can create bite-sized, conversational course content on complex topics like product management, marketing, etc.
As an e-learning course creator, you will spend time researching the topic, discuss different possible case studies, and then come up with the script.
Each course we produce is essentially a movie - entertaining, yet it puts the point across, i.e. it is focused on actionable learning.
Job Description:
Overview
Analyzes, develops, designs, and maintains software for the organization’s products and systems. Performs system integration of software and hardware to maintain throughput and program consistency. Develops, validates, and tests structures and user documentation. Work may be reviewed for accuracy and overall adequacy. Follows established processes and directions.
- 3–8 years of overall industry experience with a bachelor’s degree in an appropriate engineering discipline.
- 3+ years of experience in network protocol stack design and Android networking software components (e.g. DHCP, TCP/IP, UDP, socket programming, routing concepts, subnet concepts, etc.).
- WLAN device driver design & development skills for the Qualcomm, Broadcom, Marvell, or MediaTek Wi-Fi stack.
- Solid understanding of Wi-Fi 802.11 principles and functionality. Must be able to speak to topics including Wi-Fi roaming logic, security protocols (WPA, WPA2, etc.).
- Proficient in debugging and implementing Linux device drivers.
- Working knowledge of development and debugging tools on Android OS, including Klocwork.
- Experience with network protocol stack design and Android networking software components (e.g. DHCP, TCP/IP, UDP, socket programming, routing concepts, subnet concepts, etc.).
- Experience with bus interfaces (PCIe, USB, SDIO, etc.), to ensure issues can be debugged in the SDIO, USB, and PCIe interfaces on the Wi-Fi driver-firmware communication path.
- Should be very capable of handling Wireshark and OmniPeek wireless sniffer tools to capture wireless packets and analyze network and device problems.
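The subnet concepts called out above can be demonstrated with Python's standard-library ipaddress module; the addresses below are arbitrary examples.

```python
import ipaddress

net = ipaddress.ip_network("192.168.1.0/24")   # a /24 network
host = ipaddress.ip_address("192.168.1.42")    # a host address

print(net.netmask)        # 255.255.255.0
print(net.num_addresses)  # 256 addresses in a /24
print(host in net)        # True: membership test against the subnet

# Splitting the /24 into two /25 subnets:
print([str(s) for s in net.subnets(prefixlen_diff=1)])
# ['192.168.1.0/25', '192.168.1.128/25']
```

The same mask and prefix arithmetic underlies routing-table lookups and DHCP scope configuration in the networking components listed above.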