Senior Dynamics 365 Business Central/NAV Techno-Functional Consultant
Job Description
Location : Hyderabad
Timings: 1:30 PM to 10:30 PM IST
Primary Responsibilities:
- Technical design, development and testing of core Dynamics 365 Business Central Apps
- Work with internal stakeholders to understand the technical requirements and scope out modifications and custom development
- Convert functional requirements to technical specifications and design documents
- Ensure development best practices are adhered to by other team members
- Conduct Code Reviews and mentor other team members
Qualifications:
- 10+ years of Dynamics NAV development and/or Dynamics 365 Business Central experience
- Strong knowledge of C/AL, and experience with AL
- Understanding of Events and Subscribers
- Experience with Dynamics NAV web services/APIs to integrate with third-party software
- Hands-on experience with integrations
- Hands-on knowledge of RDLC reports
- Strong knowledge of the Purchase and Sales modules; Finance is a plus
- Strong understanding of the SDLC
- Excellent communication, planning and organization skills
- Good working experience with US customers/teams
Preferred Qualifications:
- Certificate in Dynamics NAV development (or equivalent in Dynamics 365 BC)
Similar jobs
Key Responsibilities:
1. Threat Research: Research emerging cyber threats. You will monitor threat actor activities, study their tactics, techniques, and procedures (TTPs), and help identify potential risks.
2. Alert Triage and Incident Analysis: Support the analysis of security alerts generated by our in-house platform. You will work alongside the team to identify critical issues and provide timely intelligence to help mitigate threats.
3. Data Collection and OSINT: Assist in gathering and analyzing data using Open Source Intelligence (OSINT) methodologies. You will help collect relevant information to support ongoing threat investigations.
4. Report Preparation: Contribute to the preparation of threat intelligence reports for internal and external stakeholders. You will learn how to convey complex technical information in a clear and actionable manner.
5. SOP Development: Collaborate with the team to develop and refine Standard Operating Procedures (SOPs) for systematic threat analysis. Your input will help ensure that our procedures are efficient and scalable.
6. Cross-functional Collaboration: Work closely with various teams, including product development and data acquisition, to support the integration of new intelligence sources and improve the effectiveness of our threat intelligence platform.
Key Qualifications:
Educational Background: Completed a degree in Cybersecurity, Computer Science, Information Technology, or a related field.
Basic Knowledge of Cybersecurity: A foundational understanding of cybersecurity concepts, including web application security, threat analysis, and vulnerability assessment.
Familiarity with OSINT: Basic knowledge of Open Source Intelligence (OSINT) tools and methodologies for data collection.
Technical Skills: Familiarity with scripting languages such as Python, Ruby, or Go is a plus. Experience with automation and data analysis tools will be advantageous.
Communication Skills: Strong written and verbal communication skills, with the ability to learn how to convey technical findings effectively.
Problem-Solving and Adaptability: A proactive attitude with strong problem-solving skills. You should be comfortable learning in a fast-paced and dynamic environment.
Additional Skills:
Interest in Cybersecurity Challenges: Participation in bug bounty programs, Capture The Flag (CTF) challenges, or cybersecurity competitions is a plus.
Willingness to Learn: A keen interest in developing skills in threat intelligence, threat actor profiling, and behavioral analysis.
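Since the role lists Python scripting alongside OSINT data collection and alert triage, here is a minimal, purely illustrative sketch (not part of the job description) of the kind of script such work involves: extracting candidate indicators of compromise from free-form text. The sample input and all names are hypothetical, and real IOC extraction is considerably more robust.

```python
import re

# Simple, illustrative patterns -- production IOC extraction is more robust.
IP_RE = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")
DOMAIN_RE = re.compile(r"\b[a-z0-9-]+(?:\.[a-z0-9-]+)+\b")

def extract_iocs(text: str) -> dict:
    """Pull candidate IP addresses and domains out of free-form text."""
    ips = set(IP_RE.findall(text))
    # Dotted IPs also match the domain pattern, so subtract them out.
    domains = set(DOMAIN_RE.findall(text)) - ips
    return {"ips": sorted(ips), "domains": sorted(domains)}

sample = "Beacon to 203.0.113.7 then DNS lookup for evil-c2.example.com"
print(extract_iocs(sample))
```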
Preferably 4 to 7 years of experience selling any type of boilers, electric boilers, or superheaters.
A. Assist the company in application discussions with clients and generate enquiries.
B. Convince clients of our superior technology and sell the electric type.
C. Fill out questionnaires during the bidding stage for major bids and projects, and interact with clients and the team on quotes and all follow-up.
D. Make weekly/monthly reports of client visits; review with management and the team on enhancing market share; maintain lists of RFQs, quotes, orders, etc.
E. Handle technical and application selling, quote preparation, and follow-up with clients of all natures, and strategise solutions to ward off competition.
F. Take the initiative in solving problems at site for products supplied, when called for.
G. Ensure guarantee runs, educate clients on the proper running of our products, and minimise and resolve guarantee claims.
H. Participate in exhibitions and educate clients on our product features; screen web portals and download and bid on new opportunities; arrange vendor registration for domestic and international markets; and constantly follow up with foreign agents.
Job Description
We are looking for an experienced engineer with superb technical skills. You will primarily be responsible for architecting and building large-scale data pipelines that deliver AI and analytics solutions to our customers. The right candidate will enthusiastically take ownership of developing and managing continuously improving, robust, scalable software solutions.
Although your primary responsibilities will be around back-end work, we prize individuals who are willing to step in and contribute to other areas, including automation, tooling, and management applications. Experience with, or a desire to learn, machine learning is a plus.
Skills
- Bachelor's/Master's/PhD in CS, or equivalent industry experience
- Demonstrated expertise in building and shipping cloud-native applications
- 5+ years of industry experience in administering (including setting up, managing, and monitoring) data processing pipelines (both streaming and batch) using frameworks such as Kafka Streams and PySpark, and streaming/analytics databases such as Druid or Hive
- Strong industry expertise with containerization technologies, including Kubernetes (EKS/AKS) and Kubeflow
- Experience with cloud platform services such as AWS, Azure, or GCP, especially EKS and Managed Kafka
- 5+ years of industry experience in Python
- Experience with popular modern web frameworks such as Spring Boot, Play Framework, or Django
- Experience with scripting languages; Python experience highly desirable. Experience in API development using Swagger
- Implementing automated testing platforms and unit tests
- Proficient understanding of code versioning tools, such as Git
- Familiarity with continuous integration tools such as Jenkins
Responsibilities
- Architect, design, and implement large-scale data processing pipelines using Kafka Streams, PySpark, Fluentd, and Druid
- Create custom operators for Kubernetes and Kubeflow
- Develop data ingestion processes and ETLs
- Assist in DevOps operations
- Design and implement APIs
- Identify performance bottlenecks and bugs, and devise solutions to these problems
- Help maintain code quality, organization, and documentation
- Communicate with stakeholders regarding various aspects of the solution
- Mentor team members on best practices
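The role's pipelines run on Kafka Streams, PySpark, and Druid; as a framework-agnostic sketch of the core operation such pipelines perform, here is a keyed, tumbling-window aggregation in plain Python. The event shape and all names are illustrative assumptions, not the actual stack's API.

```python
from collections import defaultdict

def window_counts(events, window_secs=60):
    """Group (timestamp, key) events into fixed tumbling windows
    and count occurrences per (window_start, key) pair."""
    counts = defaultdict(int)
    for ts, key in events:
        # Floor the timestamp to the start of its window.
        window_start = (ts // window_secs) * window_secs
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(3, "login"), (61, "login"), (65, "click"), (119, "login")]
print(window_counts(events))
```

In a real streaming framework the same logic is expressed declaratively (e.g. a windowed `groupBy` plus `count`), with the framework handling late data, state, and fault tolerance.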
Requirements:
- Retrieve Data from Databases
SSRS developers optimize SQL Server queries to retrieve data efficiently and quickly. As teams submit requests for different ways of doing things, SSRS developers work with them to find solutions that meet their needs.
- Design Data Solutions
To create and design best-practice data warehouse solutions that support business reporting tasks. This is achieved using different programs and software, such as Microsoft Excel, MS SQL Server, and Tableau.
- Create and Maintain Reporting Processes
To maintain a record of the different reporting processes in place, and evolve ways and means for more effective data management.
- Resolve IT and Data Issues
- Provide Delivery Plans and Estimates for Projects
- Ability to work independently; good communication and Excel skills.
- Extensive experience in loading data using Salesforce Data Loader.
- Expert in Salesforce configuration and admin activities
- Integration development experience (REST/SOAP)
- Should be expert in working with Flows, Process Builder, Platform Events, and Apex
Roles and Responsibilities
- Review the business requirements
- Prepare solution design
- Develop and test the solution
- Provide technical inputs for the project
WE ARE GRAPHENE
Graphene is an award-winning AI company, developing customized insights and data solutions for corporate clients. With a focus on healthcare, consumer goods, and financial services, our proprietary AI platform is disrupting market research with an approach that allows us to get into the minds of customers to a degree unprecedented in traditional market research.
Graphene was founded by corporate leaders from Microsoft and P&G and works closely with the Singapore Government and universities in creating cutting-edge technology. We are gaining traction with many Fortune 500 companies globally.
Graphene has a 6-year track record of delivering financially sustainable growth and is one of the few start-ups that are self-funded, yet profitable and debt-free.
We already have a strong bench of leaders in place. Now, we are looking to groom more talent for our expansion into the US. Join us and take your growth, and ours, to the next level!
WHAT WILL THE ENGINEER-ML DO?
- Primary Purpose: As part of a highly productive and creative AI (NLP) analytics team, optimize algorithms/models for performance and scalability, engineer & implement machine learning algorithms into services and pipelines to be consumed at web-scale
- Daily Grind: Interface with data scientists, project managers, and the engineering team to achieve sprint goals on the product roadmap, and ensure healthy models, endpoints, and CI/CD.
- Career Progression: Senior ML Engineer, ML Architect
YOU CAN EXPECT TO
- Work in a product-development team capable of independently authoring software products.
- Guide junior programmers, set up the architecture, and follow modular development approaches.
- Design and develop code which is well documented.
- Optimize the application for maximum speed and scalability
- Adhere to the best Information security and Devops practices.
- Research and develop new approaches to problems.
- Design and implement schemas and databases with respect to the AI application
- Cross-pollinate with other teams.
HARD AND SOFT SKILLS
Must Have
- Problem-solving abilities
- Extremely strong programming background – data structures and algorithms
- Advanced Machine Learning: TensorFlow, Keras
- Python, spaCy, NLTK, Word2Vec, Graph databases, Knowledge-graph, BERT (derived models), Hyperparameter tuning
- Experience with OOP and design patterns
- Exposure to RDBMS/NoSQL
- Test Driven Development Methodology
Good to Have
- Working in cloud-native environments (preferably Azure)
- Microservices
- Enterprise Design Patterns
- Distributed Systems
- Experience, preferably with web applications rather than websites
- Experience with HTML5, CSS3, jQuery & JavaScript
- Experience with JSON data structures
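The must-have list above names hyperparameter tuning; as a framework-free illustration of the pattern (not the team's actual method), here is a toy grid search. The parameter grid and the scoring function are purely made up for the example.

```python
from itertools import product

def grid_search(score_fn, grid):
    """Evaluate score_fn on every parameter combination in grid
    and return the best (params, score) pair."""
    names = sorted(grid)
    best = None
    for values in product(*(grid[n] for n in names)):
        params = dict(zip(names, values))
        score = score_fn(**params)
        if best is None or score > best[1]:
            best = (params, score)
    return best

# Toy objective: peaks at lr=0.1, depth=3 (illustrative only).
toy = lambda lr, depth: -abs(lr - 0.1) - abs(depth - 3)
params, score = grid_search(toy, {"lr": [0.01, 0.1, 1.0], "depth": [1, 3, 5]})
print(params, score)
```

In practice the same loop is usually delegated to a tuning library (random or Bayesian search) because exhaustive grids grow exponentially with the number of parameters.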