

- 4+ years of experience with strong fundamentals in Windchill customization and configuration, reporting framework customization, workflow customization, and customer handling
- Strong customization background covering form processors, validators, data utilities, form controllers, etc.
- Strong programming skills in Java/J2EE technologies: JavaScript, GWT, jQuery, XML, JSP, SQL, etc.
- Deep knowledge of Windchill architecture
- Experience in at least one full-lifecycle PLM implementation with Windchill.
- Strong coding skills in Windchill development and customization; ThingWorx Navigate development (mandatory); ThingWorx architecture configuration and mashup creation; ThingWorx and Windchill upgrades
- Build and configuration management experience (mandatory): HP QC, JIRA, Azure, SVN, GitHub, Ant
- Knowledge & Experience in Build and Release process
- Experience with custom upgrades is a plus.
- Understanding of the application development environment, database, data management, and infrastructure capabilities and constraints; understanding of database administration, database design, and performance tuning
- Follow quality processes for tasks with appropriate reviews; participate in knowledge sharing within the team.

This requirement is for a Data Engineer in Gurugram for a Data Analytics project.
Responsibilities:
- Building ETL/ELT pipelines from various data sources using SQL/Python/Spark
- Ensuring that data is modelled and processed according to the architecture and to both functional and non-functional requirements
- Understanding and implementing required development guidelines, design standards, and best practices
- Delivering the right solution architecture, automation, and technology choices
- Working cross-functionally with enterprise architects, information security teams, and platform teams
- Suggesting and implementing architecture improvements
Eligibility:
- Experience with programming languages such as Python or Scala
- Knowledge of Data Warehouse, Business Intelligence, and ETL/ELT data processing issues
- Ability to create and orchestrate ETL/ELT processes in different tools (ADF, Databricks Workflows)
- Experience working with the Databricks platform: workspace, Delta Lake, workflows, jobs, Unity Catalog
- Understanding of SQL and relational databases
- Practical knowledge of various relational and non-relational database engines in the cloud (Azure SQL Database, Azure Cosmos DB, Microsoft Fabric, Databricks)
- Hands-on experience with data services offered by the Azure cloud
- Knowledge of Apache Spark (Databricks, Azure Synapse Spark Pools)
- Experience performing code reviews of ETL/ELT pipelines and SQL queries
- Analytical approach to problem solving
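As a rough illustration of the extract-transform-load work described in this role, here is a minimal ETL sketch in plain Python with sqlite3 standing in for the SQL/Spark tooling above; the table and column names (`raw_orders`, `customer_totals`) are hypothetical:

```python
import sqlite3

def run_pipeline(conn):
    """Minimal ETL sketch: extract raw orders, transform, load a summary table."""
    cur = conn.cursor()
    # Extract: read from a hypothetical raw source table.
    rows = cur.execute("SELECT customer, amount FROM raw_orders").fetchall()
    # Transform: aggregate amounts per customer, dropping bad records.
    totals = {}
    for customer, amount in rows:
        if amount is None or amount < 0:
            continue  # basic data-quality check
        totals[customer] = totals.get(customer, 0) + amount
    # Load: write the summary into a target table.
    cur.execute("CREATE TABLE IF NOT EXISTS customer_totals (customer TEXT, total REAL)")
    cur.execute("DELETE FROM customer_totals")
    cur.executemany("INSERT INTO customer_totals VALUES (?, ?)", totals.items())
    conn.commit()
    return totals

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_orders (customer TEXT, amount REAL)")
    conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                     [("a", 10.0), ("a", 5.0), ("b", 7.0), ("b", -1.0)])
    print(run_pipeline(conn))
```

In production tooling such as Databricks Workflows or ADF, each of these stages would typically be a separate, orchestrated task rather than one function.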
Responsibilities:
- Designing cloud infrastructure for scalability, performance, availability, and security.
- Migrating existing applications and data to cloud platforms: designing the migration plan, assessing risks, and ensuring a smooth transition to the cloud.
- Designing and implementing security measures for cloud infrastructure.
- Optimizing cloud infrastructure costs by identifying areas where costs can be reduced.
Eligibility:
- Experience in cloud computing concepts, architectures, and deployment models.
- Deep knowledge of AWS.
- Experience with programming languages such as Java, Python, and Ruby.
- Proficient in designing and deploying cloud infrastructure, including computing, storage, networking, and security services.
- Familiarity with cloud security principles and practices, including data protection, access control, network security, and compliance.
Hello,
I am Prashant, Business Head at Jal Electricals.
Jal Electricals is a leading supplier, importer, and distributor of digital door lock product lines. We need a professional digital marketing agent to optimize and run extensive Facebook ad campaigns, someone who knows how to create profitable campaigns.
Our target audience is in developed countries, where high-rise apartment buildings make it easiest for us to bring this digital door lock product to customers.
Our daily budget ranges from $500 to $900, and we can spend up to $200,000 per month on a single advertising campaign. Given this budget, we need the marketer to research the project thoroughly before starting implementation.
My product needs a marketing campaign to increase sales, so I would like to hire a digital marketer or Facebook advertising company with strong expertise and experience in this field.
- Create Exclusive Distributors (minimum 4 in 4 months) in your respective areas of operation
- To work directly with the distributor and the sales executives in designing and implementing key marketing campaigns.
- To actively collect inventory, stock, and work reports from the distributor on a daily/weekly basis.
- To oversee the smooth functioning and supply chain of the authorized distributors in your territory.
- To oversee sales executives' schedules and plan them accordingly.
- To facilitate cross-functional communications among project stakeholders.
- To actively engage yourself with core marketing policies required for business development.
- To handle grievances and claims as they arise, maintaining proper coordination with the company.
- To actively follow other marketing strategies and instructions as and when instructed by the company.

Job Description
- 4-5 years of overall experience, including 2+ years as a full-stack developer with skills in Python, JavaScript, and databases.
- Excellent programming skills and hands-on experience in Python, JavaScript, HTML, and CSS.
- Experience working in the Data Science domain is an advantage but not mandatory.
- Ability to positively contribute to a team and willingness to quickly complete large volumes of work with high quality.
- Excellent analytical and critical thinking skills.
- Excellent verbal and written communication skills.
- BE/BTech or MCA degree, or any graduate with the desired skills.



● You have a minimum of 7 years of experience building high-performance, consumer-facing mobile applications at product companies of a decent scale.
● You have a keen eye for mobile architecture and able to assist your team in making the right choices for every project
● You have previous experience building React Native applications from scratch (an added advantage).
● You have a passion for mentoring and helping people on your team grow and achieve their goals.
● You practice test-driven development.
● You are familiar with both Android and iOS design patterns, and GraphQL.
● You have some exposure to native app development in Swift, Kotlin, or Java.
● You have strong knowledge of software development fundamentals, including a relevant background in computer science and agile development methodologies.
● You are an excellent collaborator & communicator. You know that startups are a team sport. You listen to others, aren’t afraid to speak your mind and always try to ask the right questions.
● You are excited by the prospect of working in a distributed team and company.
Location: We are open to candidates working from anywhere in India/across the globe. At the moment, however, like most teams, we are fully remote.
- 1-5 years of experience building and maintaining robust data pipelines, enriching data, and developing low-latency, high-performance data analytics applications.
- Experience handling complex, high-volume, multi-dimensional data and architecting data products on streaming, serverless, and microservices-based architectures and platforms.
- Experience in Data warehousing, Data modeling, and Data architecture.
- Expert-level proficiency with relational and NoSQL databases.
- Expert-level proficiency in Python and PySpark.
- Familiarity with Big Data technologies and utilities (Spark, Hive, Kafka, Airflow).
- Familiarity with cloud services (preferably AWS)
- Familiarity with MLOps processes such as data labeling, model deployment, data-model feedback loops, and data drift monitoring.
Key Roles/Responsibilities:
- Act as a technical leader in resolving problems, communicating with both technical and non-technical audiences.
- Identifying and solving issues with data pipelines regarding consistency, integrity, and completeness.
- Lead data initiatives, architecture design discussions, and implementation of next-generation BI solutions.
- Partner with data scientists and tech architects to build advanced, scalable, efficient self-service BI infrastructure.
- Provide thought leadership and mentor data engineers in information presentation and delivery.
- Total experience of 7-10 years; should be interested in teaching and research
- 3+ years’ experience in data engineering which includes data ingestion, preparation, provisioning, automated testing, and quality checks.
- 3+ years of hands-on experience with Big Data cloud platforms such as AWS and GCP, data lakes, and data warehouses
- 3+ years with Big Data and analytics technologies; experience in SQL and in writing Spark code in Python, Scala, or Java
- Experience in designing, building, and maintaining ETL systems
- Experience in data pipeline and workflow management tools like Airflow
- Application development background, along with knowledge of analytics libraries and open-source Natural Language Processing, statistical, and big data computing libraries
- Familiarity with Visualization and Reporting Tools like Tableau, Kibana.
- Should be good at technical storytelling
Qualification: B.Tech / BE / M.Sc / MBA / B.Sc; certifications in Big Data technologies and cloud platforms such as AWS, Azure, and GCP are preferred
Primary Skills: Big Data + Python + Spark + Hive + Cloud Computing
Secondary Skills: NoSQL+ SQL + ETL + Scala + Tableau
Selection Process: 1 Hackathon, 1 Technical round and 1 HR round
Benefit: Free-of-cost training on Data Science from top-notch professors





