
About Wibmo
Hiring: Salesforce CPQ Developer
Experience: 6+ years
Shift timings: 7:30 PM to 3:30 AM
Location: India (Remote)
Key skills: SteelBrick CPQ, Billing, Sales Cloud, LWC, Apex integrations, DevOps, APL, ETL tools
Design scalable and efficient Salesforce Sales Cloud solutions that meet best practices and business requirements.
Lead the technical design and implementation of Sales Cloud features, including CPQ, Partner Portal, Lead, Opportunity and Quote Management.
Provide technical leadership and mentorship to development teams. Review and approve technical designs, code, and configurations.
Work with business stakeholders to gather requirements, provide guidance, and ensure that solutions meet their needs. Translate business requirements into technical specifications.
Oversee and guide the development of custom Salesforce applications, including custom objects, workflows, triggers, and LWC/ Apex code.
Ensure data quality, integrity, and security within the Salesforce platform. Implement data migration strategies and manage data integrations.
Establish and enforce Salesforce development standards, best practices, and governance processes. Monitor and optimize the performance of Salesforce solutions, including addressing performance issues and ensuring efficient use of resources.
Stay up-to-date with Salesforce updates and new features. Propose and implement innovative solutions to enhance Salesforce capabilities and improve business processes.
Document designs and code consistently throughout the design/development process.
Diagnose, resolve, and document system issues to support the project team.
Research questions with respect to both maintenance and development activities.
Perform post-migration system review and ongoing support.
Prepare and deliver demonstrations/presentations to client audiences and professional seniors/peers.
Consistently adhere to best practices around code/data source control, ticket tracking, etc., over the course of an assignment.
Skills/Experience:
Bachelor’s degree in Computer Science, Information Systems, or related field.
6+ years of experience in architecting and designing full stack solutions on the Salesforce Platform.
Must have 3+ years of experience in architecting, designing, and developing Salesforce CPQ (SteelBrick CPQ) and Billing solutions.
Minimum 3+ years of Lightning Framework development experience (Aura & LWC).
CPQ Specialist and Salesforce Platform Developer II certifications are required.
Extensive development experience with Apex Classes, Triggers, Visualforce, Lightning, Batch Apex, Salesforce DX, Apex Enterprise Patterns, Apex Mocks, Force.com API, Visual Flows, Platform Events, SOQL, Salesforce APIs, and other programmatic solutions on the Salesforce platform.
Experience debugging Apex CPU limit errors and SOQL query exceptions, refactoring code, and working with complex implementations involving features such as asynchronous processing.
Clear insight into Salesforce platform best practices, coding and design guidelines, and governor limits.
Experience with development tools and technologies: Visual Studio Code, Git, and DevOps setup to automate deployments/releases.
Knowledge of integration architecture as well as third-party integration and ETL tools (such as Informatica, Workato, Boomi, MuleSoft, etc.) with Salesforce.
Experience in Agile development, iterative development, and proof of concepts (POCs).
Excellent written and verbal communication skills with ability to lead technical projects and manage multiple priorities in a fast-paced environment.
Required Skills:
• Minimum of 4-6 years of experience in data modeling (including conceptual, logical, and physical data models).
• 2-3 years of experience in Extraction, Transformation and Loading (ETL) work using data migration tools like Talend, Informatica, DataStage, etc.
• 4-6 years of experience as a database developer in Oracle, MS SQL, or another enterprise database, with a focus on building data integration processes.
• Candidate should have exposure to a NoSQL technology, preferably MongoDB.
• Experience processing large data volumes, indicated by experience with Big Data platforms (Teradata, Netezza, Vertica, Cloudera, Hortonworks, SAP HANA, Cassandra, etc.).
• Understanding of data warehousing concepts and decision support systems.
• Ability to handle sensitive and confidential material and adhere to worldwide data security and privacy requirements.
• Experience writing documentation for design and feature requirements.
• Experience developing data-intensive applications on cloud-based architectures and infrastructures such as AWS, Azure, etc.
• Excellent communication and collaboration skills.
Job description

Position: Data Engineer
Experience: 6+ years
Work Mode: Work from Office
Location: Bangalore

Please note: This position is focused on development rather than migration. Experience in Nifi or Tibco is mandatory.

Mandatory Skills: ETL, DevOps platform, Nifi or Tibco

We are seeking an experienced Data Engineer to join our team. As a Data Engineer, you will play a crucial role in developing and maintaining our data infrastructure and ensuring the smooth operation of our data platforms. The ideal candidate should have a strong background in advanced data engineering, scripting languages, cloud and big data technologies, ETL tools, and database structures.
Responsibilities:
• Utilize advanced data engineering techniques, including ETL (Extract, Transform, Load), SQL, and other advanced data manipulation techniques.
• Develop and maintain data-oriented scripting using languages such as Python.
• Create and manage data structures to ensure efficient and accurate data storage and retrieval.
• Work with cloud and big data technologies, specifically the AWS and Azure stacks, to process and analyze large volumes of data.
• Utilize ETL tools such as Nifi and Tibco to extract, transform, and load data into various systems.
• Have hands-on experience with database structures, particularly MSSQL and Vertica, to optimize data storage and retrieval.
• Manage and maintain the operations of data platforms, ensuring data availability, reliability, and security.
• Collaborate with cross-functional teams to understand data requirements and design appropriate data solutions.
• Stay up-to-date with the latest industry trends and advancements in data engineering and suggest improvements to enhance our data infrastructure.
Requirements:
• A minimum of 6 years of relevant experience as a Data Engineer.
• Proficiency in ETL, SQL, and other advanced data engineering techniques.
• Strong programming skills in scripting languages such as Python.
• Experience in creating and maintaining data structures for efficient data storage and retrieval.
• Familiarity with cloud and big data technologies, specifically the AWS and Azure stacks.
• Hands-on experience with ETL tools, particularly Nifi and Tibco.
• In-depth knowledge of database structures, including MSSQL and Vertica.
• Proven experience in managing and operating data platforms.
• Strong problem-solving and analytical skills with the ability to handle complex data challenges.
• Excellent communication and collaboration skills to work effectively in a team environment.
• Self-motivated with a strong drive for learning and keeping up-to-date with the latest industry trends.
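The ETL responsibilities listed above can be illustrated with a minimal sketch in Python. This is a hypothetical toy pipeline (all function and variable names are our own, not this team's actual tooling): extract yields raw records from a source, transform cleans and normalizes them, and load appends them to a target store.

```python
# Minimal ETL sketch: extract raw records, transform (clean/normalize),
# then load into a target store. All names here are hypothetical.

def extract(rows):
    """Extract: yield raw records from a source (here, an in-memory list)."""
    for row in rows:
        yield row

def transform(record):
    """Transform: trim and title-case the name, cast the amount to float."""
    return {
        "name": record["name"].strip().title(),
        "amount": float(record["amount"]),
    }

def load(records, target):
    """Load: append cleaned records to the target store."""
    for rec in records:
        target.append(rec)
    return target

source = [{"name": "  alice ", "amount": "10.5"}, {"name": "BOB", "amount": "3"}]
warehouse = []
load((transform(r) for r in extract(source)), warehouse)
# warehouse now holds the cleaned records
```

In a real pipeline the extract and load stages would talk to databases or message systems (e.g. via Nifi or Tibco, as the posting requires) rather than in-memory lists, but the staged shape is the same.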
About Us
Welthungerhilfe (WHH) is an international aid agency headquartered in Germany with a vision of “Zero Hunger by 2030”, the second Sustainable Development Goal set by the UN. It has thousands of programs fighting hunger in more than 30 countries. (https://www.welthungerhilfe.org, https://www.childgrowthmonitor.org)
Child Growth Monitor (CGM) is Welthungerhilfe’s digital innovation project leveraging AI to fight child malnutrition. Around 200 million children around the world suffer from malnutrition, which contributes to one-third of deaths of children under the age of five. In the fight against malnutrition, detection is the first important step, but it is not an easy task: measuring children with traditional methods is still a complex, slow, and expensive process, frequently resulting in poor data and wrong assessments of a child’s health. Early detection of malnutrition is therefore key to initiating treatment, minimizing the risk of complications, and saving lives.
Our solution to measure children replaces hardware (bulky measuring boards and physical scales) with off-the-shelf cell phones and AI. CGM uses augmented reality-enabled smartphones to record 3D scans of children. Artificial Intelligence is used to predict their height, weight, and middle upper-arm circumference needed to know the nutritional status of the children.
What you would do to help us save millions of children's lives
We are looking for an experienced Sr. Software Engineer to work collaboratively with our dedicated team of tech specialists (AI, ML & Data Scientists), distributed globally, who share responsibility for delivering a high-quality, scalable healthcare product in an agile approach.
This posting concerns a full-time assignment for Welthungerhilfe’s CGM in a freelance capacity; full-time (payroll) employment is scheduled for the end of 2022 at the latest. We are in the process of setting up CGM as a registered social business from around January 2023 and moving all consultants onto CGM payrolls.
Responsibilities
- Design, develop, refine/improve, and implement features of the CGM product in a secure, well-tested, and performant way.
- Debug current issues.
- Collaborate with Product Management and other stakeholders within Engineering (Frontend, UX, etc.) to maintain a high bar for quality in a fast-paced, iterative environment.
- Advocate for improvements to product quality, security, and performance.
- Solve technical problems of moderate scope and complexity.
- Craft code that meets our internal standards for style, maintainability, and best practices for a high-scale Mobile app environment. Maintain and advocate for these standards through code review.
- Provide mentorship for Junior and Intermediate Engineers on your team to help them grow in their technical responsibilities and remove blockers to their autonomy.
Requirements and skills
- Comfort working in a highly agile, intensely iterative software development process.
- Demonstrated capacity to clearly and concisely communicate about complex technical, architectural, and/or organizational problems and propose thorough iterative solutions.
- Experience with performance and optimization problems and a demonstrated ability to both diagnose and prevent these problems.
- Extensive experience in core Python (3-7 years)
- Should have worked on Azure or another cloud, and on Azure Data Factory or a similar data platform
- Experience using the Flask/Flutter frameworks
- Experience working on mobile/web applications (writing Flask framework code)
- Knowledge of DevOps Pipeline
- Good experience in SQL database
- Working knowledge of data engineering (ETL pipelines, Result Generation Module)
- Basic understanding of front-end technologies such as Vue JS
- Working knowledge in creating predictive models for AI and ML-based features
- B.Tech/BE in Computer Science with a total of 7 years of software development/programming experience.
Desired skills
- Good to have: experience with Kubernetes (an open-source container orchestration system)
- Good to have experience working on healthcare mobile apps (predictive analytics and image diagnostics)
- Experience owning a project from concept to production, including proposal, discussion, and execution.
- Proficient understanding of code versioning tools
Soft/Human Skills
- Positive and solution-oriented mindset.
- Excellent communication & interpersonal skills
- Self-motivated and self-managing, with excellent organizational skills.
- Ability to thrive in a fully remote organization
You will:
- Write excellent production code and tests and help others improve in code-reviews
- Analyze high-level requirements to design, document, estimate, and build systems
- Coordinate across teams to identify, resolve, mitigate and prevent technical issues
- Coach and mentor engineers within the team to develop their skills and abilities
- Continuously improve the team's practices in code-quality, reliability, performance, testing, automation, logging, monitoring, alerting, and build processes
You have:
For (Fullstack):
- 2-10 years of experience
- Strong in DS & Algorithms
- Hands-on experience in the programming languages: JavaScript (React or Angular), Python, SQL.
- Experience with AWS.
For (Geo Team):
- 4-10 years of experience
- Experience with Big Data technologies like Hadoop, Spark, MapReduce, Kafka, etc.
- Experience using object-oriented languages (Java, Python)
- Experience in working with different AWS technologies.
- Experience in software design, architecture and development.
- Excellent competencies in data structures & algorithms.
For (Backend):
- 2-10 years of experience
- Hands-on product development experience using Java/C++/Python
- Experience with AWS, SQL, Git
- Strong in data structures and algorithms
Additional nice to have skills/certifications:
For Java skill set:
Mockito, Grizzly, Netty, VertX, Jersey / JAX-RS, Swagger / Open API, Nginx, Protocol Buffers, Thrift, Aerospike, Redis, Kinesis, Sed, Awk, Perl
For Python skill set: Data Engineering experience, Athena, Lambda, EMR, Spark, Glue, Step Functions, Hadoop, Kinesis, Orc, Parquet, Perl, Awk, Redshift
For (Data Engineering):
- 2-10 years of experience
- Experience with object-oriented/object-functional scripting languages: Python.
- Experience with AWS cloud services: EC2, RDS, Redshift, S3, Athena, Glue
- Must be proficient in Git, Jenkins, and CI/CD (Continuous Integration / Continuous Deployment)
- Experience in big data technologies like Hadoop, MapReduce, Spark, etc.
- Experience with Amazon Web Services and Docker
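The "strong in data structures & algorithms" requirement above typically means interview-style problems like the following. This is a generic illustrative exercise (not taken from this employer's process): finding two indices whose values sum to a target in O(n) using a hash map, rather than the O(n²) nested-loop approach.

```python
# Classic data-structures exercise: two-sum in O(n) with a hash map.
def two_sum(nums, target):
    """Return indices (i, j) with nums[i] + nums[j] == target, or None."""
    seen = {}  # maps value -> index of its first occurrence
    for i, n in enumerate(nums):
        complement = target - n
        if complement in seen:
            return seen[complement], i
        seen[n] = i  # record this value for later lookups
    return None

# two_sum([2, 7, 11, 15], 9) -> (0, 1), since 2 + 7 == 9
```

The hash map trades O(n) extra space for a single pass over the input, which is the kind of space/time reasoning these roles evaluate.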
Work Location: Bangalore
Work Days: Monday through Friday
Week Off: Saturday and Sunday
Shift: Day Time
Primary Skills & Responsibilities
• Strong knowledge in Power BI (DAX + Power Query + Power BI Service + Power BI Desktop Visualisations) and Azure Data Storages.
• Review all job requirements and specifications required for deploying the solution into the production environment.
• Perform unit and other tests per the deployment checklist using test cases, and maintain documentation for the same.
• Work with the Lead to resolve all issues within the required timeframe and flag any delays.
• Collaborate with the development team to review new programs for implementation activities, manage communication (if required) with different functions to resolve issues, and assist implementation leads in managing production deployments.
• Document all issues during the deployment phase, record all findings from logs and the actual deployment, and share the analysis.
• Review and maintain all technical and business documents. Conduct and monitor the software implementation lifecycle, and assist with/make appropriate customizations to software for clients per the deployment/implementation guide.
• Train new members on product deployment and issues; identify issues in processes and provide solutions for the same.
• Ensure project tasks are appropriately updated in JIRA / the ticketing tool (in-progress/done) and raise issues.
• Take self-initiative to learn/understand the technologies in use, i.e. Vertica SQL, the internal data integration tool (Athena), the Pulse Framework, and Tableau.
• Flexible to work during non-business hours in some exceptional cases (for a few days) as required to meet client time zones.
Experience on Tools and Technologies preferred:
ETL Tools: Talend, Informatica, Ab Initio, or DataStage
BI Tools: Tableau, Jaspersoft, Pentaho, or QlikView
Database: Experience in Oracle or SS
Methodology: Experience in SDLC and/or Agile Methodology
- 7 years of hands-on experience on database development
- Very strong and hands-on in Oracle Database and PL/SQL development
- Hands on experience in designing solutions and developing for data migration projects using Oracle PL/SQL
- Experience with Oracle Argus Safety, ArisG and other Safety/Clinical systems
- Working experience in development of ETL process, DB Design and Data Structures
- Excellent knowledge of relational databases: tables, views, constraints, indexes (B-Tree, Bitmap, and function-based), object types, stored procedures, functions, packages and triggers, dynamic SQL, SET TRANSACTION, and PL/SQL cursor variables with REF CURSOR
Excellent written and verbal communication skills
Reporting and Business Intelligence
Minimum of 8 to 10 years of experience in report building and Business Intelligence, with profound knowledge of Oracle BI Publisher and BI Analytics in the Oracle Fusion HCM domain, especially the Core HR, Payroll, and Absences modules.
Primary skill:
- Experience generating BI publisher reports, dashboards and drill-down reports
- BI Publisher (RTF design/ eText/ Scheduling/ Parameter Handling/Bursting/backup and migration of reports to different pods)
- Design Oracle BI Publisher Data Models, Data sets and Templates/sub-templates
- Strong PL/SQL and Advanced PL/SQL experience
- Solid understanding of performance tuning best practices and experience improving end-to-end processing times
- Knowledge on the data models related to Fusion HCM (Core HR, Absence and Payroll modules)
Nice to have skills:
- Strong OBI Administration experience
- Create mobile applications with Oracle Business Intelligence Mobile App Designer
- Use BI Mobile to access BI Content
- Create and modify Interactive Dashboards
- Knowledge of additional reporting tools viz. HCM Extract / OTBI / Analysis Dashboard would be an added advantage
Behavioural and Process Skills
- Experience working in Agile teams
- Self-motivated and self-directed abilities to prioritize and execute tasks in a high-pressure environment with "time-critical" deadlines
- Proven analytical, evaluative, and problem-solving abilities
- Possesses a team and customer service provision orientation
Job Role – SDE2
Duration – 12 months
Location – HYD
Key Skills:
- 3-5 years of experience
- Understanding of databases and building reports
- The ideal candidate should have experience in Cosmos, Kusto, and Power BI
- Mandatory experience in a database query language (such as SQL)
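The "database query language (such as SQL)" requirement above can be illustrated with a minimal sketch using Python's standard-library sqlite3 module. The schema and data here are hypothetical, purely to show a parameterized SQL query of the kind the role expects:

```python
import sqlite3

# Hypothetical schema: a tiny reports table, queried with parameterized SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE reports (id INTEGER, status TEXT)")
conn.executemany(
    "INSERT INTO reports VALUES (?, ?)",
    [(1, "open"), (2, "done"), (3, "open")],
)

# Parameterized query (the "?" placeholder avoids SQL injection).
open_count = conn.execute(
    "SELECT COUNT(*) FROM reports WHERE status = ?", ("open",)
).fetchone()[0]
```

Kusto (KQL) and Cosmos DB use different query dialects, but the core skill — expressing filters and aggregations declaratively — carries over directly from SQL.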








