
Support Salesforce Developer
They provide both wholesale and retail funding.
- 2+ years of experience
- Excellent technical skills in Salesforce or similar applications, including building customized applications using Java, microservices, Apex classes and triggers, Visualforce pages, Apex jobs, Lightning components, integrations, and Process Builder, or configuring components/FlexiPages to implement the expected functionality.
- Worked with tools such as Copado, Data Loader, Jira, VS Code, Eclipse, SonarQube, etc.
- Able to manage deployment activities, code reviews, and deployments using Data Loader.
- Implementation of Entitlements and Milestones, platform events, visual flows, etc.
- Implement web services using REST/SOAP APIs (a minimal example follows this list).
- Inbound/outbound integration.
- Involvement in solution design and estimation.
- Create email templates, page layouts, record types, custom fields, Lightning record pages, etc.
- Performed pre-deployment and post-deployment activities.
- Experience with workflows and custom metadata.
- Implementing DevOps in enterprise applications.
- For the Support Developer role: a problem-solving mindset.
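For the REST API integration items above, here is a minimal, illustrative sketch (not the employer's actual code) of an outbound call to the Salesforce REST API from Python using the requests library; the instance URL, access token, and API version below are placeholders.

import requests

# Placeholder values; in practice these come from an OAuth flow / connected app.
INSTANCE_URL = "https://yourInstance.my.salesforce.com"
ACCESS_TOKEN = "<access-token>"
API_VERSION = "v58.0"

def query_salesforce(soql: str) -> dict:
    """Run a SOQL query against the Salesforce REST API and return the JSON payload."""
    resp = requests.get(
        f"{INSTANCE_URL}/services/data/{API_VERSION}/query",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        params={"q": soql},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    result = query_salesforce("SELECT Id, Name FROM Account LIMIT 5")
    for record in result.get("records", []):
        print(record["Id"], record["Name"])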
Good to have –
- BFSI industry experience
- Agile project management experience

Job Responsibilities:
- Managing and maintaining the efficient functioning of containerized applications and systems within an organization
- Design, implement, and manage scalable Kubernetes clusters in cloud or on-premise environments
- Develop and maintain CI/CD pipelines to automate infrastructure and application deployments, and track all automation processes
- Implement workload automation using configuration management tools, as well as infrastructure as code (IaC) approaches for resource provisioning
- Monitor, troubleshoot, and optimize the performance of Kubernetes clusters and underlying cloud infrastructure
- Ensure high availability, security, and scalability of infrastructure through automation and best practices
- Establish and enforce cloud security standards, policies, and procedures
- Work with agile methodologies
Primary Requirements:
- Kubernetes: Proven experience in managing Kubernetes clusters (min. 2-3 years)
- Linux/Unix: Proficiency in administering complex Linux infrastructures and services
- Infrastructure as Code: Hands-on experience with CM tools like Ansible, as well as knowledge of resource provisioning with Terraform or other cloud-based utilities
- CI/CD Pipelines: Expertise in building and monitoring complex CI/CD pipelines to manage the build, test, packaging, containerization, and release processes of software
- Scripting & Automation: Strong scripting and process automation skills in Bash, Python
- Monitoring Tools: Experience with monitoring and logging tools (Prometheus, Grafana)
- Version Control: Proficient with Git and familiar with GitOps workflows.
- Security: Strong understanding of security best practices in cloud and containerized environments.
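As an illustration of the Kubernetes cluster management and monitoring requirements above, the following is a minimal sketch (an assumption, not part of the role's actual tooling) that uses the official kubernetes Python client to flag pods that are not Running or Succeeded; it assumes kubectl credentials are already configured locally.

from kubernetes import client, config

def report_unhealthy_pods() -> None:
    """List pods across all namespaces and print those not in a healthy phase."""
    config.load_kube_config()  # uses the local kubeconfig (e.g. ~/.kube/config)
    v1 = client.CoreV1Api()
    pods = v1.list_pod_for_all_namespaces(watch=False)
    for pod in pods.items:
        phase = pod.status.phase
        if phase not in ("Running", "Succeeded"):
            print(f"{pod.metadata.namespace}/{pod.metadata.name}: {phase}")

if __name__ == "__main__":
    report_unhealthy_pods()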
Skills/Traits that would be an advantage:
- Kubernetes administration experience, including installation, configuration, and troubleshooting
- Kubernetes development experience
- Strong analytical and problem-solving skills
- Excellent communication and interpersonal skills
- Ability to work independently and as part of a team
To prepare placement students with the technical knowledge, skills, and confidence required to succeed in campus recruitment drives, technical interviews, and entry-level job roles in the industry.
Skillset: Aem+node
Experience: 3 to 9 Years
Location: Bangalore, Hyderabad, Pune, Chennai
Notice Period: Immediate to 15 Days
Job Description:
(Kindly mention Mandatory skills in your project)
If you're interested in the above requirement, please share the details mentioned below:
Current CTC:
Expected CTC:
Current Company:
Notice Period:
If serving notice period, last working day (LWD):
Current Location:
Preferred Location:
Total experience:
Relevant experience:
Highest qualification with location and passing year:
Date of joining (if offer in hand from another company):
Offer in hand, with amount:
Also, send your updated resume (CV) ASAP in Word format.

Responsibilities
- Writing clean, high-quality, high-performance, maintainable code
- Develop and support software including applications, database integration, interfaces, and new functionality enhancements
- Responsible for the product development and maintenance
- Analyze and resolve performance bottlenecks
Requirements
- 3+ Years of experience in Web Application Development
- Should excel in working with large-scale applications and frameworks
- Expert in programming using Angular 8/10/11 and Web API
- Should have knowledge of relational and non-relational (NoSQL) databases.
- Strong knowledge of SDLC methodologies – Agile/Scrum
- Very good development skills in front-end technologies/JavaScript.
- Good understanding of Object-Oriented Programming, Design Concepts & Unit Testing
- Good understanding of Web Security Concepts


Designation: Laravel Developer
Job Description:
- Discussing project aims with the client and development team.
- Designing and building web applications using Laravel.
- Troubleshooting implementation issues and debugging builds.
- Working with front-end and back-end developers on projects.
- Testing functionality for users and the backend.
- Ensuring that integrations run smoothly.
- Scaling projects based on client feedback.
- Recording and reporting on work done in Laravel.
- Maintaining web-based applications.
- Presenting work in meetings with clients and management.
The Assistant Director's responsibilities include:
1. Crafting shooting schedules in coordination with the team
2. Formulating storyboards that represent salient occasions in each script
3. Ensuring that all applicable filming crew and cast members arrive when needed
4. Positioning, instructing and supporting extras on the set
5. Motivating the filming crew ahead of each take
6. Settling minor discipline-related concerns on the set
7. Supporting and promoting the observance of existing safety protocols
Those who can apply:
1. Education: should have completed a specialized course in filmmaking/film direction.
2. Experience: minimum 0 to 3 years in the field of video film making/documentaries/ad films/TVC.
3. Well versed in Marathi and English (Marathi writing skills are an additional advantage)
4. Effective communication skills
5. Able to plan projects properly with the team and client.
6. Should be located within a 5-10 km range of Aundh, Pune.
7. Should have relevant skills and interests
Perks: Certificate, Informal dress code, Job offer
Number of openings - 2
NOTE - This job is not for feature films/serials; it is purely in the audio-video advertising sector.
Experience: 8 to 10 years; Notice period: 0 to 20 days
Job Description :
- Provision GCP resources based on the architecture design and features aligned with business objectives
- Monitor resource availability and usage metrics, and provide guidelines for cost and performance optimization
- Assist IT/business users in resolving GCP service-related issues
- Provide guidelines for cluster automation and migration approaches and techniques, including ingest, store, process, analyse and explore/visualise data
- Provision GCP resources for data engineering and data science projects
- Assistance with automated data ingestion, data migration and transformation (good to have)
- Assistance with deployment and troubleshooting of applications in Kubernetes
- Establish connections and credibility in how to address business needs via designing and operating cloud-based data solutions
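As an illustration of the resource-provisioning items above, here is a minimal, hypothetical sketch using the google-cloud-storage Python client to create a labelled bucket; the project ID, bucket name, location, and labels are placeholders, and application-default credentials are assumed.

from google.cloud import storage

def provision_bucket(project_id: str, bucket_name: str, location: str = "asia-south1") -> None:
    """Create a GCS bucket and attach basic labels for cost tracking."""
    client = storage.Client(project=project_id)
    bucket = client.create_bucket(bucket_name, location=location)
    bucket.labels = {"team": "data-engineering", "env": "dev"}  # example labels only
    bucket.patch()
    print(f"Created bucket {bucket.name} in {location}")

if __name__ == "__main__":
    provision_bucket("my-gcp-project", "my-example-ingest-bucket")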
Key Responsibilities / Tasks :
- Building complex CI/CD pipelines for cloud-native PaaS services such as databases, messaging, storage, and compute in Google Cloud Platform
- Building deployment pipelines with GitHub CI (Actions)
- Building Terraform code to deploy infrastructure as code
- Working with deployment and troubleshooting of Docker, GKE, OpenShift, and Cloud Run
- Working with Cloud Build, Cloud Composer, and Dataflow
- Configuring software to be monitored by AppDynamics
- Configuring Stackdriver logging and monitoring in GCP
- Working with Splunk, Kibana, Prometheus, and Grafana to set up dashboards
Your skills, experience, and qualification :
- Total experience of 5+ years as a DevOps engineer. Should have at least 4 years of experience with Google Cloud and GitHub CI.
- Should have strong experience in Microservices/API.
- Should have strong experience with DevOps tools like GitHub CI, TeamCity, Jenkins, and Helm.
- Should know application deployment and testing strategies in Google Cloud Platform.
- Defining and setting development, test, release, update, and support processes for DevOps operations
- Strive for continuous improvement and build continuous integration, continuous delivery, and continuous deployment (CI/CD) pipelines
- Excellent understanding of Java
- Knowledge on Kafka, ZooKeeper, Hazelcast, Pub/Sub is nice to have.
- Understanding of cloud networking, security such as software defined networking/firewalls, virtual networks and load balancers.
- Understanding of cloud identity and access
- Understanding of the compute runtime and the differences between native compute, virtual and containers
- Configuring and managing databases such as Oracle, Cloud SQL, and Cloud Spanner.
- Excellent troubleshooting skills
- Working knowledge of various tools, open-source technologies
- Awareness of critical concepts of Agile principles
- Google Professional Cloud DevOps Engineer certification is desirable.
- Experience with Agile/SCRUM environment.
- Familiar with Agile Team management tools (JIRA, Confluence)
- Understand and promote Agile values: FROCC (Focus, Respect, Openness, Commitment, Courage)
- Good communication skills
- Pro-active team player
- Comfortable working in multi-disciplinary, self-organized teams
- Professional knowledge of English
- Differentiators : knowledge/experience about
Experience Range | 2 - 10 Years
Function | Information Technology
Desired Skills | Must Have Skills:
• Good experience in PySpark, including DataFrame core functions and Spark SQL
• Good experience in SQL databases; able to write queries of fair complexity
• Excellent experience in Big Data programming for data transformation and aggregations
• Good at ELT architecture: business rules processing and data extraction from the data lake into data streams for business consumption
• Good customer communication
• Good analytical skills
Education Type | Engineering
Degree / Diploma | Bachelor of Engineering, Bachelor of Computer Applications, Any Engineering
Specialization / Subject | Any Specialisation
Job Type | Full Time
Job ID | 000018
Department | Software Development
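For the PySpark and Spark SQL skills listed above, here is a minimal, illustrative sketch (the file path and column names are made up) showing the same aggregation done once with DataFrame functions and once with Spark SQL.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("sales-aggregation-example").getOrCreate()

# Hypothetical input: a CSV with columns region and amount.
df = spark.read.option("header", True).option("inferSchema", True).csv("/data/sales.csv")

# DataFrame API: total and average amount per region.
agg_df = df.groupBy("region").agg(
    F.sum("amount").alias("total_amount"),
    F.avg("amount").alias("avg_amount"),
)
agg_df.show()

# Equivalent Spark SQL over a temporary view.
df.createOrReplaceTempView("sales")
spark.sql(
    "SELECT region, SUM(amount) AS total_amount, AVG(amount) AS avg_amount "
    "FROM sales GROUP BY region"
).show()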

We are an early stage start-up, building new fintech products for small businesses. Founders are IIT-IIM alumni, with prior experience across management consulting, venture capital and fintech startups. We are driven by the vision to empower small business owners with technology and dramatically improve their access to financial services. To start with, we are building a simple, yet powerful solution to address a deep pain point for these owners: cash flow management. Over time, we will also add digital banking and 1-click financing to our suite of offerings.
We have developed an MVP which is being tested in the market. We have closed our seed funding from marquee global investors and are now actively building a world class tech team. We are a young, passionate team with a strong grip on this space and are looking to on-board enthusiastic, entrepreneurial individuals to partner with us in this exciting journey. We offer a high degree of autonomy, a collaborative fast-paced work environment and most importantly, a chance to create unparalleled impact using technology.
Reach out if you want to get in on the ground floor of something which can turbocharge SME banking in India!
Technology stack at Velocity comprises a wide variety of cutting-edge technologies like NodeJS, Ruby on Rails, Reactive Programming, Kubernetes, AWS, Python, ReactJS, Redux (Saga), Redis, Lambda, etc.
Key Responsibilities
- Responsible for building data and analytical engineering pipelines with standard ELT patterns, implementing data compaction pipelines, data modelling, and overseeing overall data quality
- Work with the Office of the CTO as an active member of our architecture guild
- Write pipelines to consume data from multiple sources
- Write a data transformation layer using DBT to transform millions of records into data warehouses
- Implement data warehouse entities with common, re-usable data model designs, with automation and data quality capabilities
- Identify downstream implications of data loads/migration (e.g., data quality, regulatory)
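To illustrate the DBT transformation-layer responsibility above, this is a minimal, assumed sketch of an Airflow DAG that triggers dbt on a daily schedule; the dbt project path and DAG id are placeholders, not part of the actual stack described here.

from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical dbt project location on the Airflow worker.
DBT_PROJECT_DIR = "/opt/airflow/dbt/analytics"

with DAG(
    dag_id="daily_dbt_transformations",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"dbt run --project-dir {DBT_PROJECT_DIR}",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"dbt test --project-dir {DBT_PROJECT_DIR}",
    )
    dbt_run >> dbt_test  # run models first, then run data quality tests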
What To Bring
- 3+ years of software development experience; startup experience is a plus
- Past experience working with Airflow and DBT is preferred
- 2+ years of experience working in any backend programming language
- Strong first-hand experience with data pipelines and relational databases such as Oracle, Postgres, SQL Server, or MySQL
- Experience with DevOps tools (GitHub, Travis CI, and JIRA) and methodologies (Lean, Agile, Scrum, Test-Driven Development)
- Experienced in formulating ideas, building proofs of concept (POCs), and converting them into production-ready projects
- Experience building and deploying applications on on-premise and AWS or Google Cloud infrastructure
- Basic understanding of Kubernetes and Docker is a must
- Experience in data processing (ETL, ELT) and/or cloud-based platforms
- Working proficiency and communication skills in verbal and written English


- 1+ year of experience
- Full responsibility for designing and building core back-end modules and system architectures.
- Good understanding of database concepts.
- Experience with at least two programming languages (PHP & GoLang).
- Experience in creating at least one web application.
- In-depth knowledge of MySQL and a basic understanding of MongoDB, along with a basic understanding of different programming paradigms.
- Ability to interact and coordinate with a team.
- Should have a basic understanding of APIs.
- Refactor code in a timely manner to keep it DRY and reusable.
- Should have worked in at least one framework in the backend.
- A background in Software Architecture would be preferred.
Requirements
- Develop data-driven web solutions on PHP frameworks (ideally CakePHP), but be undeterred when the need arises to create an application from the ground up; maintain, contribute to, and adhere to our programming best practices and guidelines.
- Work with a team of UI designers, programmers, and server admins, to bring brand new concepts to the market.
- Help improve our code quality through writing unit tests, automation and performing code reviews.
- Immediate Joiner

