
We are looking for a bright and exceptional Engineering Manager to join our Hyderabad-based Technology team. The role involves building a complex, next-generation product used by our clients and architecting solutions to support new technical and business initiatives.
What you’ll do:
• Manage, design, execute, and take complete responsibility for the delivery and maintenance of software projects/products
• Help the team translate business requirements into R&D tasks
• Work with business groups to outline project deliverables and manage the roadmap of R&D tasks
• Work with Technical Relationship Managers to understand client-initiated R&D requests
• Act as a point of contact for managing and driving production defects to resolution
• Tailor processes to help manage time-sensitive issues and bring them to appropriate closure
• Engage and manage a team of highly talented technologists, and help them grow professionally through regular mentoring
What you’ll need:
• A bachelor’s degree in Computer Science with 8+ years of experience; fintech domain experience is a plus
• Demonstrated track record of end-to-end delivery of enterprise-grade software
• Strong technology acumen, knowledge of the software engineering process, design knowledge, and architectural intelligence
• Superior project management skills to ensure high-quality and timely solution delivery
• Attention to detail and quality, and the ability to work well in and across teams
• Ability to advocate for and influence multiple stakeholders
• Excellent analytical and reasoning skills
• Ability to learn new domains and deliver output
• Experience leading a team of highly skilled engineers
• Strong communication skills
Members of the Arcesium Company Group do not discriminate in employment matters on the basis of sex, race, colour, caste, creed, religion, pregnancy, national origin, age, military service eligibility, veteran status, sexual orientation, marital status, disability, or any other protected class.

Similar jobs
Job Title: PySpark/Scala Developer
Functional Skills: Experience in Credit Risk/Regulatory risk domain
Technical Skills: Spark, PySpark, Python, Hive, Scala, MapReduce, Unix shell scripting
Good to Have Skills: Exposure to Machine Learning Techniques
Job Description:
5+ years of experience developing, fine-tuning, and implementing programs/applications using Python/PySpark/Scala on a Big Data/Hadoop platform.
Roles and Responsibilities:
a) Work with a leading bank’s Risk Management team on specific projects/requirements pertaining to risk models in consumer and wholesale banking
b) Enhance machine learning models using PySpark or Scala
c) Work with data scientists to build ML models based on business requirements, and follow the ML cycle to deploy them all the way to the production environment
d) Participate in feature engineering, model training, scoring, and retraining
e) Architect data pipelines and automate data ingestion and model jobs
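The feature-engineer, train, score, retrain cycle named in the responsibilities above can be sketched in a few lines. This is a toy illustration only: plain Python stands in for PySpark, and the record fields, labels, and model (a tiny hand-rolled logistic regression) are all invented for the example.

```python
import math

# Toy sketch of the ML cycle: engineer features -> train -> score.
# Field names ("utilization", "late_payments") and the data are invented.

def engineer_features(record):
    """Turn a raw account record into a numeric feature vector."""
    return [record["utilization"], record["late_payments"] / 12.0]

def score(w, x):
    """Probability of default for one feature vector (last weight is the bias)."""
    z = sum(wi * xi for wi, xi in zip(w, x)) + w[-1]
    return 1.0 / (1.0 + math.exp(-z))

def train(rows, labels, epochs=500, lr=0.5):
    """Fit a tiny logistic-regression model with batch gradient descent."""
    w = [0.0] * (len(rows[0]) + 1)  # one weight per feature, plus a bias
    for _ in range(epochs):
        grads = [0.0] * len(w)
        for x, y in zip(rows, labels):
            err = score(w, x) - y  # gradient of the log loss is (p - y) * x
            for i, xi in enumerate(x):
                grads[i] += err * xi
            grads[-1] += err
        w = [wi - lr * g / len(rows) for wi, g in zip(w, grads)]
    return w

raw = [
    {"utilization": 0.9, "late_payments": 6},
    {"utilization": 0.2, "late_payments": 0},
    {"utilization": 0.8, "late_payments": 4},
    {"utilization": 0.1, "late_payments": 1},
]
labels = [1, 0, 1, 0]  # 1 = defaulted
features = [engineer_features(r) for r in raw]
model = train(features, labels)
print(score(model, engineer_features({"utilization": 0.85, "late_payments": 5})))
```

Retraining is just calling `train` again on a refreshed batch; in the actual role the same shape would be expressed over Spark DataFrames rather than Python lists.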
Skills and competencies:
Required:
· Strong analytical skills in conducting sophisticated statistical analysis using bureau/vendor data, customer performance data, and macro-economic data to solve business problems
· Working experience in PySpark and Scala, developing code to validate and implement models in Credit Risk/Banking
· Experience with distributed systems such as Hadoop/MapReduce, Spark, streaming data processing, and cloud architecture
· Familiarity with machine learning frameworks and libraries (e.g., scikit-learn, Spark ML, TensorFlow, PyTorch)
· Experience in systems integration, web services, and batch processing
· Experience migrating code to PySpark/Scala is a big plus
· Ability to act as a liaison, conveying the information needs of the business to IT and data constraints to the business, with equal fluency in business strategy and IT strategy, business processes, and workflow
· Flexibility in approach and thought process
· Willingness to learn and keep up with periodic changes in Fed regulatory requirements
Key Responsibilities:
Design and Development: Develop robust and scalable robotic applications using ROS2. Implement software for various robotic systems, ensuring high performance and reliability. Hands-on development of ROS2 nodes, services/clients, and publishers/subscribers. Lead and develop path/motion planning algorithms covering route planning, trajectory optimization, decision making, and open-space planning. Good understanding of robot dynamics, kinematics, and modeling.
System Integration: Integrate sensors, actuators, and other hardware components with robotic systems. Ensure seamless communication between hardware and software layers. Experience integrating perception sensors such as IMU, GPS, stereo cameras, Lidar, Radar, and various others.
URDF Modeling: Create and maintain accurate URDF models for robotic systems. Ensure models accurately represent the physical configuration and kinematics of the robots.
Algorithm Implementation: Implement and optimize algorithms for perception, localization, mapping, navigation, and control.
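The kind of kinematic model a URDF describes can be illustrated with the simplest case: forward kinematics for a two-link planar arm. This is a hedged sketch, not ROS2 code; the link lengths and joint angles are invented for the example.

```python
import math

# Forward kinematics for a 2-link planar arm -- the kind of chain a URDF
# would describe with two revolute joints. All numbers are illustrative.

def forward_kinematics(l1, l2, theta1, theta2):
    """Return the (x, y) position of the end effector.

    theta1 is the shoulder angle from the x-axis; theta2 is the elbow
    angle relative to link 1 (both in radians).
    """
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

# Fully extended along the x-axis: the end effector sits at l1 + l2.
x, y = forward_kinematics(1.0, 0.5, 0.0, 0.0)
print(x, y)  # -> 1.5 0.0
```

In a ROS2 stack the same chain would live in the URDF and be evaluated by the TF tree; the closed form above is what that machinery computes for this geometry.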
Responsibilities:
- Troubleshoot and resolve complex technical issues related to Active Directory, Windows System Administration, Mac Support, Mobile Device Management, Desktop Support, Technical Support, Jamf, Intune, Azure Active Directory, and Confluence.
- Collaborate with other teams to ensure timely resolution of customer issues.
- Document solutions and knowledge articles for future reference.
- Participate in on-call rotation for after-hours support as needed.
What we are looking for :
- Bachelor's degree in Computer Science or related field preferred.
- Proficient in Active Directory, Windows System Administration, Mac Support, Mobile Device Management, Desktop Support, Technical Support, Jamf, Intune, Azure Active Directory, and Confluence.
- Familiarity with Microsoft Office Suite and other relevant software applications.
- Strong problem-solving skills and ability to work under pressure.
- Excellent verbal and written communication skills.
We are a thriving software development agency seeking a skilled Sales and Lead Generation Specialist to help us expand our business on Upwork and Fiverr. Your primary role will be to generate new sales leads, manage our profiles, and write engaging content to attract potential clients.
Key Responsibilities:
• Generate sales and leads by effectively utilizing Upwork and Fiverr.
• Manage and enhance our profiles on both platforms to increase visibility.
• Create compelling content for our profiles, highlighting our services and success stories.
• Communicate with potential clients to understand their needs and propose suitable marketing solutions.
• Stay updated with platform trends to maximize opportunities for lead generation.
Requirements:
• Proven experience in sales and lead generation, specifically on Upwork and Fiverr.
• Strong content creation and management skills.
• Excellent communication and negotiation skills.
• Familiarity with the IT industry.
What We Offer:
• Opportunity to work with a dynamic software development agency with a wide scope of projects.
• A low monthly fixed retainer cost with the potential for high commissions. We offer up to 20% commission on all business received from Fiverr and Upwork, depending on the size and scope of the work.
• A supportive and collaborative work environment.
We are looking for someone who is enthusiastic, results-driven, and has a keen eye for identifying and seizing business opportunities. If you are ready to contribute to our growth and thrive in a fast-paced, exciting environment, we would love to hear from you!
● Take responsibility for developing product features
● Engage with Product Management and Business to drive the agenda, set your priorities, and deliver awesome product features that keep the platform ahead of market scenarios.
● Design and develop using Node.js/Feather.js, React, AWS ML stack
● Develop and utilize your skills as a mentor and leader. Grow your team’s capacity by mentoring other engineers and interviewing candidates.
Must-haves
- Strong proficiency in NodeJS and JavaScript.
- Strong proficiency in top NodeJS frameworks like Express.js, Feathers.js, StrongLoop, Koa.js, or Hapi.
- Good knowledge of SQL/NoSQL databases; ability to develop services with polyglot persistence.
- Basic working knowledge of front-end technologies like HTML5 and CSS3.
- A strong presence of mind, and excellent language and communication skills.
We are looking for a Senior Platform Engineer responsible for handling our GCP/AWS clouds. The candidate will be responsible for automating the deployment of cloud infrastructure and services to support application development and hosting (architecting, engineering, deploying, and operationally managing the underlying logical and physical cloud computing infrastructure).
Job Description:
● Collaborate with teams to build and deliver solutions implementing serverless, microservice-based, IaaS, PaaS, and containerized architectures in GCP/AWS environments.
● Responsible for deploying highly complex, distributed transaction processing systems.
● Work on continuous improvement of the products through innovation and learning; a knack for benchmarking and optimization is essential.
● Hiring, developing, and cultivating a high-performing, reliable cloud support team
● Building and operating complex CI/CD pipelines at scale
● Work with GCP services such as Private Service Connect, Cloud Run, Cloud Functions, Pub/Sub, Cloud Storage, and Networking
● Collaborate with Product Management and Product Engineering teams to drive excellence in Google Cloud products and features.
● Ensure efficient data storage and processing functions comply with company security policies and best practices in cloud security.
● Ensure scaled database setup/monitoring with near-zero downtime
- Create Exclusive Distributors (minimum 4 in 4 months) in your respective areas of operation
- To work directly with the distributor and the sales executives in designing and implementing key marketing campaigns.
- To actively collect inventory, stock, and work reports from the distributor on a daily/weekly basis.
- To look after the smooth functioning and supply chain of the authorized distributor within your periphery.
- To oversee sales executives’ schedules and plan them accordingly.
- To facilitate cross-functional communications among project stakeholders.
- To actively engage with core marketing policies required for business development.
- To look after grievances and claims arising from time to time, maintaining proper coordination with the company.
- To actively follow other marketing strategies and instructions as and when instructed by the company.
We are actively seeking a Senior Data Engineer experienced in building data pipelines and integrations from 3rd party data sources by writing custom automated ETL jobs using Python. The role will work in partnership with other members of the Business Analytics team to support the development and implementation of new and existing data warehouse solutions for our clients. This includes designing database import/export processes used to generate client data warehouse deliverables.
- 2+ years of experience as an ETL developer with strong data architecture knowledge around data warehousing concepts, SQL development and optimization, and operational support models.
- Experience using Python to automate ETL/data processing jobs.
- Design and develop ETL and data processing solutions using data integration tools, Python scripts, and AWS/Azure/on-premise environments.
- Experience / Willingness to learn AWS Glue / AWS Data Pipeline / Azure Data Factory for Data Integration.
- Develop and create transformation queries, views, and stored procedures for ETL processes, and process automation.
- Document data mappings, data dictionaries, processes, programs, and solutions as per established standards for data governance.
- Work with the data analytics team to assess and troubleshoot potential data quality issues at key intake points, such as validating control totals at intake and again after transformation, and transparently build lessons learned into future data quality assessments.
- Solid experience with data modeling, business logic, and RESTful APIs.
- Solid experience in the Linux environment.
- Experience working with relational databases such as MySQL and PostgreSQL, and with NoSQL stores; PostgreSQL experience preferred.
- Enterprise-level connectivity experience (such as connecting over TLS and through proxies).
- Experience with NGINX and SSL.
- Performance tune data processes and SQL queries, and recommend and implement data process optimization and query tuning techniques.
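The control-total validation mentioned in the data-quality bullet above can be sketched briefly. This is an illustrative sketch only: the field names, tolerance, and function names are invented, and a real pipeline would run the same comparison inside the ETL job rather than on Python lists.

```python
# Sketch of a control-total check: compare row counts and summed amounts
# at intake against the same totals after transformation, and fail loudly
# on drift. Field names and the tolerance are illustrative.

def control_totals(rows, amount_field="amount"):
    """Row count plus summed amount for a batch of records."""
    return len(rows), sum(r[amount_field] for r in rows)

def validate_load(source_rows, transformed_rows, tolerance=0.01):
    """Raise ValueError if the transformed batch drifted from the source."""
    src_count, src_sum = control_totals(source_rows)
    out_count, out_sum = control_totals(transformed_rows)
    if src_count != out_count:
        raise ValueError(f"row count drift: {src_count} -> {out_count}")
    if abs(src_sum - out_sum) > tolerance:
        raise ValueError(f"amount drift: {src_sum} -> {out_sum}")
    return True

source = [{"amount": 100.0}, {"amount": 250.5}]
transformed = [{"amount": 100.0}, {"amount": 250.5}]
print(validate_load(source, transformed))  # -> True
```

Running the same check after each transformation stage localizes where a batch went wrong, which is what makes control totals useful at "key intake points" rather than only at the end of the pipeline.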








