• Understand business requirements and translate them into technical requirements.
• Debug the system to explain feature behavior to users.
• Perform fit/gap analysis to evaluate each functional area in a business process against specific goals.
• Identify/modify the standard and custom reports needed to produce statutory, management, reconciliation, and other reports.
• Develop/maintain interfaces to and from Oracle General Ledger, Accounts Payable, and Accounts Receivable.
• Provide requirements to third-party applications that interface with Oracle General Ledger.
• Create ad hoc reports as per the requirements.
• Create, test and implement code changes and integrate them with existing programs as needed.
• Serve as the IT resource for Oracle Financials/HR projects; coordinate meetings and communications with the Oracle Financials/HR user community.
• Mentor employees as needed.
• Provide timely and effective reporting on status of projects.

Roles & Responsibilities:
Identify and generate high-quality leads through cold calling.
Manage the sales pipeline, track opportunities, and monitor progress.
Build and maintain strong relationships with potential and existing clients.
Maintain accurate and up-to-date records of leads and customer interactions.
Desired Profile:
1-2 years of experience in lead generation or sales.
Excellent communication and interpersonal skills.
Ability to work independently and as part of a team.
Duration: 6 months with possible extension
Location: Remote
Notice Period: Immediate Joiner Preferred
Experience: 4-6 Years
Requirements:
- B.Tech/M.Tech in Computer Science or equivalent from a reputed college, with 4–6 years of experience in a product development company
- Sound knowledge and application of algorithms and data structures, including space and time complexities
- Strong design skills involving data modeling and low-level class design
- Good knowledge of object-oriented programming and design patterns
- Proficiency in Python, Java, and Golang
- Follow industry coding standards and be responsible for writing maintainable/scalable/efficient code to solve business problems
- Hands-on experience of working with Databases and the Linux/Unix platform
- Follow SDLC in an agile environment and collaborate with multiple cross-functional teams to drive deliveries
- Strong technical aptitude and good knowledge of CS fundamentals
What will you get to do here?
- Coming up with best practices that help the team complete its technical tasks and continually improve the technology of the product/team.
- Driving the adoption of best practices and regularly participating in code reviews, design reviews, and architecture discussions.
- Experimenting with new and relevant technologies and tools, and driving adoption while measuring yourself on the impact you create.
- Implementing the long-term technology vision for your team.
- Creating architectures and designs for new solutions in existing and new areas.
- Deciding on technology and tool choices for your team and being responsible for them.
• Charting learning journeys with knowledge graphs.
• Predicting memory decay based on an advanced cognitive model.
• Ensuring content quality via study-behavior anomaly detection.
• Recommending tags for complex knowledge using NLP.
• Auto-associating concept maps from loosely structured data.
• Predicting knowledge mastery.
• Personalizing search queries.
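Memory-decay prediction of the kind listed above is commonly modeled with an exponential forgetting curve; a minimal sketch (the half-life and review interval below are illustrative assumptions, not details from this posting):

```python
def recall_probability(hours_since_review: float, half_life_hours: float) -> float:
    """Exponential forgetting curve: p = 2^(-t / half_life).

    When t equals the half-life, recall probability drops to 0.5;
    a real system would fit half_life per learner/item from study data.
    """
    return 2.0 ** (-hours_since_review / half_life_hours)

# Illustrative item: last reviewed 24h ago, estimated 48h half-life.
p = recall_probability(hours_since_review=24.0, half_life_hours=48.0)
# p ≈ 0.707, so the item is approaching its review threshold.
```

A scheduler built on this would surface items once `p` falls below some target (e.g. 0.8), which is the core of most spaced-repetition designs.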
Requirements:
• 6+ years of experience in AI/ML, including end-to-end implementations.
• Excellent communication and interpersonal skills.
• Expertise in SageMaker, TensorFlow, MXNet, or equivalent.
• Expertise with databases (e.g., NoSQL, graph).
• Expertise with backend engineering (e.g., AWS Lambda, Node.js).
• Passionate about solving problems in education.
We are Seeking:
1. AWS Serverless, AWS CDK:
Proficiency in developing serverless applications using AWS Lambda, API Gateway, S3, and other relevant AWS services.
Experience with AWS CDK for defining and deploying cloud infrastructure.
Knowledge of serverless design patterns and best practices.
Understanding of Infrastructure as Code (IaC) concepts.
Experience in CI/CD workflows with AWS CodePipeline and CodeBuild.
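The Lambda-behind-API-Gateway pattern the first skill set refers to can be sketched as a plain handler function; the route, query parameter, and response payload here are hypothetical:

```python
import json

def handler(event: dict, context=None) -> dict:
    """AWS Lambda handler for an API Gateway proxy integration.

    API Gateway delivers the HTTP request as `event`; the return value
    must carry statusCode/headers/body for the proxy response format.
    """
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Local invocation with a fake event (no AWS required):
resp = handler({"queryStringParameters": {"name": "dev"}})
```

In a CDK app, the same function would be wired to an API Gateway route and deployed as code, which is what makes the IaC and CI/CD bullets above fit together.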
2. TypeScript, React/Angular:
Proficiency in TypeScript.
Experience in developing single-page applications (SPAs) using React.js or Angular.
Knowledge of state management libraries like Redux (for React) or RxJS (for Angular).
Understanding of component-based architecture and modern frontend development practices.
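The state-management libraries named above share one core idea: a pure reducer mapping (state, action) to a new state. A language-agnostic sketch in Python (action shapes are illustrative, not Redux's actual API):

```python
def reducer(state: dict, action: dict) -> dict:
    """Pure function: the same (state, action) always yields the same new state."""
    if action["type"] == "increment":
        return {**state, "count": state["count"] + 1}
    if action["type"] == "set_user":
        return {**state, "user": action["payload"]}
    return state  # unknown actions leave state unchanged

state = {"count": 0, "user": None}
state = reducer(state, {"type": "increment"})
state = reducer(state, {"type": "set_user", "payload": "ann"})
```

Because the reducer never mutates its input, every state transition is reproducible and testable, which is the main selling point of this architecture in SPAs.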
3. Node.js:
Strong proficiency in backend development using Node.js.
Understanding of asynchronous programming and event-driven architecture.
Familiarity with RESTful API development and integration.
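The asynchronous, event-driven style called out for Node.js can be illustrated with Python's asyncio (Python is used only to keep the sketches on this page in one language; the pattern mirrors Node's event loop and promises):

```python
import asyncio

async def fetch(name: str, delay: float) -> str:
    # Stand-in for a non-blocking I/O call (HTTP request, DB query, ...).
    await asyncio.sleep(delay)
    return f"{name} done"

async def main() -> list:
    # Both "requests" run concurrently on a single event loop,
    # analogous to Promise.all in Node.js.
    return await asyncio.gather(fetch("a", 0.01), fetch("b", 0.01))

results = asyncio.run(main())
```

The point in both runtimes is the same: one thread, many in-flight operations, with control returned to the loop at every await.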
4. MongoDB/NoSQL:
Experience with NoSQL databases and their use cases.
Familiarity with data modeling and indexing strategies in NoSQL databases.
Ability to integrate NoSQL databases into serverless architectures.
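The data-modeling trade-off mentioned above usually comes down to embedding vs. referencing documents; a sketch with plain dicts standing in for MongoDB documents (collection and field names are illustrative):

```python
# Embedded model: comments live inside the post document.
# One read fetches everything, but the document grows without bound.
post_embedded = {
    "_id": "post1",
    "title": "Hello",
    "comments": [{"author": "ann", "text": "Nice!"}],
}

# Referenced model: comments are separate documents keyed by post_id.
# Documents stay small, and comments can be indexed and queried on
# their own (e.g. an index on "post_id" in a real MongoDB deployment).
comments = [{"_id": "c1", "post_id": "post1", "author": "ann", "text": "Nice!"}]

def comments_for(post_id: str) -> list:
    # Stand-in for db.comments.find({"post_id": post_id})
    return [c for c in comments if c["post_id"] == post_id]
```

Embedding favors read-heavy access to the whole aggregate; referencing favors unbounded or independently queried sub-collections, which is the indexing-strategy question the bullet raises.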
5. CI/CD:
Ability to troubleshoot and debug CI/CD pipelines.
Knowledge of automated testing practices and tools.
Understanding of deployment automation and release management processes.
Educational Background: Bachelor's degree in Computer Science, Engineering, or a related field.
Certification (preferred, an added advantage): AWS certifications (e.g., AWS Certified Developer - Associate)
The QA Engineer will work closely with the entire project team to review the overall functional and non-functional requirements while keeping the user's perspective in mind.
- Reviewing quality specifications and technical design documents to provide timely and meaningful feedback.
- Creating detailed, comprehensive and well-structured test plans and test cases.
- Estimating, prioritizing, planning and coordinating quality testing activities.
- Managing a small team of QA engineers and working on the project with the team.
Required Candidate profile
Primary Skills:
Strong working knowledge of TypeScript, MongoDB, and Express.
Strong proficiency in TypeScript and JavaScript (ES6), including DOM manipulation and the JavaScript object model.
Strong understanding of Node.js fundamentals.
Knowledge and experience working with PostgreSQL is a major plus.
Experience with ORM libraries.
Familiarity with RESTful APIs.
Experience in troubleshooting and root-cause analysis (RCA) of production issues, including analyzing logs in Kibana/Elasticsearch.
Knowledge of tools like Git, Github, JIRA, Cucumber, Jasmine, and others that make coding more efficient and easier to share.
Familiarity with working in either AWS or Azure, including Docker- and Kubernetes-based microservice deployment.
Secondary Skills:
Good communication and design skills. Experience in managing teams.
Solid back-end software development experience.
Ability to understand business requirements and translate them into technical requirements.
Experience working in an Agile environment.
Qualification:
Good experience as Software Developer.
Prior experience in a technical leadership or developer position.
Back-end development / API web services experience is mandatory.
Node.js is mandatory.
One RDBMS and one document database mandatory.
Required Skills:
- 2+ years of development experience in Java
- Strong Java Basics
- Spring Boot or Spring MVC
- Hands-on experience with relational databases (SQL queries or Hibernate) or MongoDB (JSON parsing)
- Proficient in REST API development
- Good at problem solving
Good to Have Skills:
- 2+ years of experience in using Java
- Good understanding of data structures and algorithms.
- Excellent analytical and problem solving skills.
- Ability to work in a fast-paced internet start-up environment.
- Experience in technical mentorship/coaching is highly desirable.
- We are looking for an experienced data engineer to join our team.
- The preprocessing involves ETL tasks using PySpark and AWS Glue, staging data in Parquet format on S3, and querying it with Athena.
To succeed in this data engineering position, you should care about well-documented, testable code and data integrity. We have DevOps engineers who can help with AWS permissions.
We would like to build up a consistent data lake with staged, ready-to-use data, and to build up various scripts that will serve as blueprints for various additional data ingestion and transforms.
If you enjoy setting up something which many others will rely on, and have the relevant ETL expertise, we’d like to work with you.
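The cleaning-and-staging logic described above can be sketched independently of PySpark; in a real Glue job the same rules would run as DataFrame filters and casts before writing Parquet to S3 for Athena (the column names here are hypothetical):

```python
def clean_rows(raw_rows: list) -> list:
    """Toy ETL transform: drop incomplete records and normalize types.

    In a Glue/PySpark job this would be DataFrame operations, with the
    staged result written as Parquet to S3 and registered for Athena.
    """
    staged = []
    for row in raw_rows:
        if row.get("user_id") is None or row.get("ts") is None:
            continue  # data-integrity rule: no partial records in the lake
        staged.append({
            "user_id": str(row["user_id"]),
            "ts": int(row["ts"]),
            "event": (row.get("event") or "unknown").lower(),
        })
    return staged

rows = clean_rows([
    {"user_id": 1, "ts": "1700000000", "event": "Login"},
    {"user_id": None, "ts": "1700000001", "event": "Click"},
])
```

Writing the transform as a small, pure function like this is what makes the "blueprint" scripts the posting mentions testable and reusable across ingestion jobs.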
Responsibilities
- Analyze and organize raw data
- Build data pipelines
- Prepare data for predictive modeling
- Explore ways to enhance data quality and reliability
- Potentially, collaborate with data scientists to support various experiments
Requirements
- Previous experience as a data engineer with the above technologies
