

About White Panda
White Panda as a platform offers a convenient content creation process to digital agencies, brand-turned-publishers, SaaS companies, and e-commerce platforms without the need for micromanagement.
Built on a model combining language, human resources, and marketing, the company works with thousands of content creators to power content marketing for hundreds of leading brands. Notable clients include Radisson, Apollo, Axis Bank, PNB Housing, and three of the top 10 marketing agencies in India. The core team at White Panda is a diverse group of professionals who are alumni of institutions such as the IITs, BITS, and Goldman Sachs. The venture is funded by highly successful entrepreneurs, Tier-1 investors, and IIT Gandhinagar, and is cash-flow positive.
Ultimately, the purpose of any content is to educate or entertain. We aim to have an impact on end consumers by elevating the quality of education and entertainment, partnering with 40 lakh (4 million) businesses globally.
Software Engineer – 3D, Geometry, CAD Applications
(Desktop / Cloud / Machine)
JOB RESPONSIBILITIES
• You will collaborate with a multidisciplinary engineering team to develop various applications for the Additive Manufacturing process chain.
• You will be required to research and implement advanced algorithms and mathematical models.
• You will optimize CPU performance and memory usage of applications.
• You will be involved in the design of the software architecture.
• You should be self-motivated and have strategic thinking abilities.
• The working environment and architecture primarily consist of C++ geometry processing and simulation libraries, coupled with visualization frameworks.
ESSENTIAL RESPONSIBILITIES
• Writing and Documenting High-Quality Code for Additive Manufacturing applications
• Developing Algorithms for Performance Improvements
• Bug Fixing and Regression Testing
• Developing Test Cases
• Designing, Developing and Implementing Geometry Processing Libraries
• Leading a Team of Junior Software Engineers and Developers (This ability will be a plus)
REQUIRED
• Bachelors / Masters in Mechanical Engineering, Computer Science, or a related field with 3-5 Years of Experience.
• Good Understanding of OOP Concepts and Design Patterns
• Hands-On Experience in Developing Applications for the Geometry Domain
• Math Proficiency - Linear Algebra, Numerical Analysis, Computational Geometry (a brief illustrative sketch follows the lists below)
• Ability to Work with a Multi-Disciplinary Team of Engineers.
• Technologies (some combination of these will be suitable)
o Strong C++, C# Skills
o Python
o OpenGL, WPF
o C#.NET, ASP.NET
o JavaScript, React, Node.js
o GPGPU, CUDA
o Full-stack cloud development
o Familiarity with Azure DevOps
BIG PLUS
• 3D Graphics Experience
• Machine Learning Experience
• Knowledge of Meshing and Mesh Topology
• Familiarity with 3D Printing in General and Metal 3D Printing in Particular
• Knowledge of Cloud technologies / Developing applications for the Cloud
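The requirements above lean on basic vector geometry and mesh handling. Purely as an illustration (not part of the role description), the following Python sketch, using NumPy and a hypothetical tetrahedron mesh, computes per-triangle unit normals and the total surface area via the cross-product identity from linear algebra:

import numpy as np

# Hypothetical example mesh: the four triangular faces of a unit tetrahedron.
vertices = np.array([
    [0.0, 0.0, 0.0],
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
])
triangles = np.array([
    [0, 1, 2],
    [0, 1, 3],
    [0, 2, 3],
    [1, 2, 3],
])

def triangle_normals_and_areas(vertices, triangles):
    """Return unit normals and areas per triangle (|cross| equals twice the area)."""
    a = vertices[triangles[:, 0]]
    b = vertices[triangles[:, 1]]
    c = vertices[triangles[:, 2]]
    cross = np.cross(b - a, c - a)              # normal direction, length = 2 * area
    areas = 0.5 * np.linalg.norm(cross, axis=1)
    normals = cross / (2.0 * areas[:, None])    # assumes non-degenerate triangles
    return normals, areas

normals, areas = triangle_normals_and_areas(vertices, triangles)
print("total surface area:", areas.sum())

A production geometry kernel would additionally handle degenerate triangles, very large meshes, and robust floating-point predicates, but the underlying linear algebra is the same.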
We are hiring
Get in touch with us if you have the aptitude and inclination for developing cutting-edge software products related to 3D, CAD, and Additive Manufacturing. These products incorporate machine learning and other intelligent algorithms that we will deploy on the desktop and in the cloud.
About Intech Additive Solutions
www.intechadditive.com
https://www.linkedin.com/company/intechadditive
Intech Additive Solutions Pvt. Ltd is the first Indian Original Equipment Manufacturer (OEM) to develop and supply Metal 3D Printers based on Laser Powder Bed Fusion (LPBF) technology. Intech Additive is a complete solutions provider in Metal Additive Manufacturing (AM) systems and AM Software. With its software suite, Intech's Metal 3D Printers provide customers with a ready-to-print AM solution out of the box, coupled with local after-sales services.
Intech is among the few OEMs globally to integrate its iFusion SF1 and iFusion LF series of Metal 3D Printers with its in-house developed build processing software, AMBuilder, and parameter optimization software, AMOptoMet.
Global German-Japanese machine tool major DMG MORI has invested in Intech and sits on its advisory board.
Our roadmap and plans
• We are building software products and machines that work together in modular configurations. In addition, we hope to expand the metal AM ecosystem by making adoption easy.
• We are committed to delivering first-time-right solutions, reducing the cost per part compared to traditional manufacturing methods, and providing a quick ROI to our customers.
• We are invested in growing the metal AM ecosystem and expanding its adoption in India and beyond, with competitively priced products that do not sacrifice quality.
• We maintain our strategic advantage through relentless innovation that generates Intellectual Property. We have filed patents for our essential inventions and have a healthy pipeline of new inventions to grow our IP portfolio.
Position: Product QA Leader
Experience Level: Overall 15-20 years, with a minimum of 5 years as a Product QA Leader
Mandatory Skills:
• Passion for technology with a commitment to quality
• Hands-on skills in Product/Platform QA Strategy, Automation Testing, Regression Testing, and Testing Methodologies
• QA Leadership, Mobile App (iOS/Android) Testing, Program Management, Agile / Scaled Agile / Lean methodologies
• Product Release Test Management, Product Maintenance Test Management
• Understanding of Retail, E-commerce industry (good to have)
• Understanding and hands-on knowledge of Jira, Xray, Automation, Regression Tools
• Demonstrated experience in test planning for complex features/systems
• Ability to work independently and thrive while focusing on the details, such as spotting subtle problems, raising issues, and taking ownership of the solution
• Demonstrated experience in QA project management over multiple products, simultaneously
• Excellent communication skills

Programming Languages: Perl, Java. Perl programming with strong OOP knowledge.
UI: HTML, JS
System: Linux is a must; good working knowledge and shell scripting experience required.
Prior experience in infrastructure automation and monitoring will definitely help.
Description:
The person in this role:
- Will be involved in developing new monitoring scripts, as well as enhancements and defect fixes to existing monitors (a minimal sketch of such a monitor appears after the preferred skills list below)
- Will be on call to support any incoming production/P1 internal issues that need urgent attention (team members are on call for a week, with a weekly rotation within the team)
Preferred skills:
- Perl
- Shell scripting
- Unix
- Jenkins
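Purely as a hedged illustration of the kind of monitor described above (the team's actual monitors are written in Perl and shell; this sketch uses Python for brevity, and the mount points and threshold are hypothetical):

#!/usr/bin/env python3
"""Minimal disk-usage monitor sketch. Mount points and threshold are hypothetical."""
import shutil
import sys

THRESHOLD_PCT = 90          # hypothetical alert threshold
MOUNTS = ["/", "/var"]      # hypothetical mount points to watch

def used_percent(mount):
    """Return how full a filesystem is, as a percentage."""
    usage = shutil.disk_usage(mount)
    return 100.0 * usage.used / usage.total

def main():
    alerts = []
    for mount in MOUNTS:
        pct = used_percent(mount)
        if pct >= THRESHOLD_PCT:
            alerts.append(f"{mount} is {pct:.1f}% full")
    if alerts:
        print("ALERT: " + "; ".join(alerts))
        sys.exit(2)          # non-zero exit lets the alerting pipeline page on-call
    print("OK")

if __name__ == "__main__":
    main()

A real monitor would also emit metrics and integrate with the team's alerting and rotation tooling; the check-then-exit-code pattern is the essential shape.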
Position: Big Data Engineer
What You'll Do
Punchh is seeking to hire a Big Data Engineer at either a senior or tech lead level. Reporting to the Director of Big Data, this person will play a critical role in leading Punchh’s big data innovations. Leveraging prior industry experience in big data, they will help create cutting-edge data and analytics products for Punchh’s business partners.
This role requires close collaboration with the data, engineering, and product organizations. Job functions include:
- Work with large data sets and implement sophisticated data pipelines with both structured and unstructured data.
- Collaborate with stakeholders to design scalable solutions.
- Manage and optimize our internal data pipeline that supports marketing, customer success, and data science, to name a few (a minimal orchestration sketch follows this list).
- Serve as a technical leader of Punchh’s big data platform that supports AI and BI products.
- Work with the infra and operations teams to monitor and optimize existing infrastructure.
- Occasional business travel is required.
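As a hedged sketch of the pipeline orchestration implied above (not Punchh's actual code; the DAG name, schedule, and the extract/load callables are all hypothetical), a minimal Airflow DAG wiring two daily tasks might look like this in Python:

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_events(**context):
    # Hypothetical step: pull the day's raw events from object storage.
    print("extracting events for", context["ds"])

def load_warehouse(**context):
    # Hypothetical step: load transformed events into the warehouse.
    print("loading events for", context["ds"])

with DAG(
    dag_id="daily_events_pipeline",      # hypothetical DAG name
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_events", python_callable=extract_events)
    load = PythonOperator(task_id="load_warehouse", python_callable=load_warehouse)
    extract >> load                      # load runs only after extract succeeds

Real pipelines would add retries, SLAs, backfills, and data-quality checks, but the dependency-wiring pattern is the same.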
What You'll Need
- 5+ years of experience as a Big Data engineering professional, developing scalable big data solutions.
- Advanced degree in computer science, engineering or other related fields.
- Demonstrated strength in data modeling, data warehousing and SQL.
- Extensive knowledge of cloud technologies, e.g. AWS and Azure.
- Excellent software engineering background. High familiarity with the software development life cycle. Familiarity with GitHub/Airflow.
- Advanced knowledge of big data technologies, such as programming languages (Python, Java), relational databases (Postgres, MySQL), NoSQL (MongoDB), Hadoop (EMR), and streaming (Kafka, Spark).
- Strong problem-solving skills with demonstrated rigor in building and maintaining complex data pipelines.
- Exceptional communication skills and the ability to articulate complex concepts with thoughtful, actionable recommendations.
Acceptance criteria:
Shall have working experience in creating architectures for at least 4 projects.
Shall be strong in Object-Oriented Design and Thinking.
Shall be strong in documenting software architecture and communicating the same using UML.
Shall be strong in decomposing larger systems into smaller units with clear implementation dependencies marked for planning.
Shall be strong in effort estimation techniques.
Shall possess strong Problem Solving and Analytical Skills.
Shall have strong C or C++ programming experience.
Shall possess a good understanding of either Linux, QNX, or Android operating systems.
Shall have strong experience in developing software using POSIX APIs.
Shall have a strong understanding of networking and socket programming, and working experience in at least one IPC framework such as DBUS, SOME/IP, or Binders.
Working experience with test frameworks and automation projects such as Robot Framework.
Knowledge of various software licenses and their compatibilities.
Shall possess strong knowledge of bootloaders, system startup, power management, persistency management, health management, and diagnostics frameworks for automotive systems.
Shall have good knowledge of Functional Safety.
Shall have a good understanding of hardware and processor internals.
Nice to haves:
Knowledge or working experience in AUTOSAR and/or Adaptive AUTOSAR.
Understanding of GNU/Linux and its device driver framework.
Knowledge of “Secure Programming Techniques” is a plus.
Understanding of ARM Trusted Firmware, bootloaders, and virtualization solutions is a plus.
Involvement in open-source projects in the past is a plus.
Knowledge of package management and installers.
- Should have good hands-on experience in Informatica MDM Customer 360, Data Integration (ETL) using PowerCenter, and Data Quality.
- Must have strong skills in Data Analysis, Data Mapping for ETL processes, and Data Modeling.
- Experience with the SIF framework including real-time integration
- Should have experience in building C360 Insights using Informatica
- Should have good experience in creating performant designs using Mapplets, Mappings, and Workflows for Data Quality (cleansing) and ETL.
- Should have experience in building different data warehouse architectures such as Enterprise, Federated, and Multi-Tier architectures.
- Should have experience in configuring Informatica Data Director with respect to the data governance of users, IT Managers, and Data Stewards.
- Should have good knowledge of developing complex PL/SQL queries.
- Should have working experience with UNIX and shell scripting to run Informatica workflows and to control the ETL flow.
- Should know about Informatica Server installation and have knowledge of the Administration console.
- Working experience with Developer alongside Administration is an added advantage.
- Working experience with Amazon Web Services (AWS) is an added advantage, particularly AWS S3, Data Pipeline, Lambda, Kinesis, DynamoDB, and EMR.
- Should be responsible for the creation of automated BI solutions, including requirements, design, development, testing, and deployment
Responsibilities
Identify key metrics for business growth
Map various business processes, systems to drive key metrics
Design and implement business plans, processes to drive business growth
Set comprehensive goals for performance and growth
Oversee daily operations of the company and the work of executives (IT, Marketing, Sales, Finance, etc.), with respect to implementation of processes
Evaluate performance by analyzing and interpreting data and metrics
Write and submit reports to the CEO in all matters of importance
Requirements
Proven experience as a growth leader in identifying key metrics and underlying processes for business growth
Demonstrable competency in driving process implementation
Understanding of business functions such as HR, Finance, marketing etc.
Working knowledge of data analysis and performance/operation metrics
Working knowledge of budgeting, sales, business development, and strategic planning.
Agri Industry background is preferable
Outstanding organizational and leadership abilities
Excellent interpersonal and public speaking skills
Aptitude in decision-making and problem-solving
MBA from a Tier 1/2 University
Personal Attributes
Proven leadership ability.
Ability to set and manage priorities judiciously.
Excellent written and oral communication skills.
Excellent interpersonal skills.
Ability to articulate ideas to both technical and non-technical audiences.
Exceptionally self-motivated and directed.
Keen attention to detail.
Superior analytical, evaluative, and problem-solving abilities.
Exceptional service orientation.
Ability to motivate in a team-oriented, collaborative environment.
Data Platform engineering at Uber is looking for a strong Technical Lead (Level 5a Engineer) who has built high-quality platforms and services that can operate at scale. A 5a Engineer at Uber exhibits the following qualities:
- Demonstrate tech expertise › Demonstrate technical skills to go very deep or broad in solving classes of problems or creating broadly leverageable solutions.
- Execute large scale projects › Define, plan and execute complex and impactful projects. You communicate the vision to peers and stakeholders.
- Collaborate across teams › Act as a domain resource for engineers outside your team and help them leverage the right solutions. Facilitate technical discussions and drive to a consensus.
- Coach engineers › Coach and mentor less experienced engineers and deeply invest in their learning and success. You give and solicit feedback, both positive and negative, to others you work with to help improve the entire team.
- Tech leadership › Lead the effort to define the best practices in your immediate team, and help the broader organization establish better technical or business processes.
What You’ll Do
- Build a scalable, reliable, operable and performant data analytics platform for Uber’s engineers, data scientists, products and operations teams.
- Work alongside the pioneers of big data systems such as Hive, YARN, Spark, Presto, Kafka, and Flink to build out a highly reliable, performant, easy-to-use software system for Uber’s planet-scale data.
- Become proficient in the multi-tenancy, resource isolation, abuse prevention, and self-serve debuggability aspects of a high-performance, large-scale service while building these capabilities for Uber's engineers and operations folks.
What You’ll Need
- 7+ years experience in building large scale products, data platforms, distributed systems in a high caliber environment.
- Architecture: Identify and solve major architectural problems by going deep in your field or broad across different teams. Extend, improve, or, when needed, build solutions to address architectural gaps or technical debt.
- Software Engineering/Programming: Create frameworks and abstractions that are reliable and reusable. You have advanced knowledge of at least one programming language and are happy to learn more. Our core languages are Java, Python, Go, and Scala.
- Data Engineering: Expertise in one of the big data analytics technologies we currently use such as Apache Hadoop (HDFS and YARN), Apache Hive, Impala, Drill, Spark, Tez, Presto, Calcite, Parquet, Arrow etc. Under the hood experience with similar systems such as Vertica, Apache Impala, Drill, Google Borg, Google BigQuery, Amazon EMR, Amazon RedShift, Docker, Kubernetes, Mesos etc.
- Execution & Results: You tackle large technical projects/problems that are not clearly defined. You anticipate roadblocks and have strategies to de-risk timelines. You orchestrate work that spans multiple teams and keep your stakeholders informed.
- A team player: You believe that you can achieve more on a team, that the whole is greater than the sum of its parts. You rely on others’ candid feedback for continuous improvement.
- Business acumen: You understand requirements beyond the written word. Whether you’re working on an API used by other developers, an internal tool consumed by our operation teams, or a feature used by millions of customers, your attention to details leads to a delightful user experience.








