If you can watch, play and write well about Android games, this job is for you.
Responsibilities and Duties
You will be responsible for:
- conceptualizing new content ideas,
- researching assigned topics,
- and creating high-quality content around mobile games.
Qualifications and Skills
- Impeccable written and verbal command of English is a must.
- A graduate or postgraduate degree in English literature or technology would be a plus.
- Must NOT be over 28 years old.

About BlueMods
MUST-HAVES:
- LLM Integration & Prompt Engineering
- Context & Knowledge Base Design
- Experience running LLM evals
NOTICE PERIOD: Immediate – 30 Days
SKILLS: LLM, AI, PROMPT ENGINEERING
NICE TO HAVES:
- Data Literacy & Modelling Awareness
- Familiarity with Databricks, AWS, and ChatGPT Environments
ROLE PROFICIENCY:
Role Scope / Deliverables:
- Serve as the link between business intelligence, data engineering, and AI application teams, ensuring the Large Language Model (LLM) interacts effectively with the modeled dataset.
- Define and curate the context and knowledge base that enables GPT to provide accurate, relevant, and compliant business insights.
- Collaborate with Data Analysts and System SMEs to identify, structure, and tag data elements that feed the LLM environment.
- Design, test, and refine prompt strategies and context frameworks that align GPT outputs with business objectives.
- Conduct evaluation and performance testing (evals) to validate LLM responses for accuracy, completeness, and relevance.
- Partner with IT and governance stakeholders to ensure secure, ethical, and controlled AI behavior within enterprise boundaries.
KEY DELIVERABLES:
- LLM Interaction Design Framework: Documentation of how GPT connects to the modeled dataset, including context injection, prompt templates, and retrieval logic.
- Knowledge Base Configuration: Curated and structured domain knowledge to enable precise and useful GPT responses (e.g., commercial definitions, data context, business rules).
- Evaluation Scripts & Test Results: Defined eval sets, scoring criteria, and output analysis to measure GPT accuracy and quality over time.
- Prompt Library & Usage Guidelines: Standardized prompts and design patterns to ensure consistent business interactions and outcomes.
- AI Performance Dashboard / Reporting: Visualizations or reports summarizing GPT response quality, usage trends, and continuous improvement metrics.
- Governance & Compliance Documentation: Inputs to data security, bias prevention, and responsible AI practices in collaboration with IT and compliance teams.
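To illustrate the "Evaluation Scripts & Test Results" deliverable, here is a minimal sketch of an eval harness in Python. It scores model answers against required terms from a defined eval set; `run_llm`, the canned answer, and the scoring criterion are all hypothetical stand-ins for a real GPT endpoint and production eval logic.

```python
# Minimal sketch of an LLM eval harness: score model answers against an
# expected-answer set using a simple keyword-overlap criterion.
# `run_llm` is a hypothetical placeholder for the real model call.

def run_llm(prompt: str) -> str:
    # Placeholder: a real implementation would call the GPT endpoint.
    canned = {
        "What was Q1 revenue?": "Q1 revenue was 4.2M EUR, up 8% year over year.",
    }
    return canned.get(prompt, "")

def score(answer: str, required_terms: list[str]) -> float:
    """Fraction of required terms present in the answer (case-insensitive)."""
    hits = sum(1 for t in required_terms if t.lower() in answer.lower())
    return hits / len(required_terms)

def run_evals(eval_set: list[dict], threshold: float = 0.8) -> dict:
    """Run every case, score it, and summarize pass/fail counts."""
    results = []
    for case in eval_set:
        answer = run_llm(case["prompt"])
        s = score(answer, case["required_terms"])
        results.append({"prompt": case["prompt"], "score": s, "passed": s >= threshold})
    passed = sum(r["passed"] for r in results)
    return {"total": len(results), "passed": passed, "results": results}

eval_set = [
    {"prompt": "What was Q1 revenue?", "required_terms": ["4.2M", "8%"]},
]
report = run_evals(eval_set)
```

In practice the scoring criterion would be richer (semantic similarity, rubric grading), but the shape — eval set in, per-case scores and an aggregate report out — is what feeds the AI performance dashboard described above.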
KEY SKILLS:
Technical & Analytical Skills:
- LLM Integration & Prompt Engineering – Understanding of how GPT models interact with structured and unstructured data to generate business-relevant insights.
- Context & Knowledge Base Design – Skilled in curating, structuring, and managing contextual data to optimize GPT accuracy and reliability.
- Evaluation & Testing Methods – Experience running LLM evals, defining scoring criteria, and assessing model quality across use cases.
- Data Literacy & Modeling Awareness – Familiar with relational and analytical data models to ensure alignment between data structures and AI responses.
- Familiarity with Databricks, AWS, and ChatGPT Environments – Capable of working in cloud-based analytics and AI environments for development, testing, and deployment.
- Scripting & Query Skills (e.g., SQL, Python) – Ability to extract, transform, and validate data for model training and evaluation workflows.
Business & Collaboration Skills:
- Cross-Functional Collaboration – Works effectively with business, data, and IT teams to align GPT capabilities with business objectives.
- Analytical Thinking & Problem Solving – Evaluates LLM outputs critically, identifies improvement opportunities, and translates findings into actionable refinements.
- Commercial Context Awareness – Understands how sales and marketing intelligence data should be represented and leveraged by GPT.
- Governance & Responsible AI Mindset – Applies enterprise AI standards for data security, privacy, and ethical use.
- Communication & Documentation – Clearly articulates AI logic, context structures, and testing results for both technical and non-technical audiences.
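The "Context & Knowledge Base Design" skill above can be sketched as a small context-injection step: relevant definitions from a curated knowledge base are prepended to a prompt template so the model answers with enterprise terminology. The knowledge base entries, template, and function names here are illustrative assumptions, not a real schema.

```python
# Minimal sketch of context injection for a prompt template. Only the
# definitions relevant to the question are injected, keeping the prompt
# compact. All entries and names are illustrative.

KNOWLEDGE_BASE = {
    "net_revenue": "Gross revenue minus returns, discounts, and allowances.",
    "churn_rate": "Share of customers lost during a period.",
}

PROMPT_TEMPLATE = (
    "You are a business-intelligence assistant.\n"
    "Definitions:\n{context}\n"
    "Question: {question}\n"
)

def build_prompt(question: str, terms: list[str]) -> str:
    """Inject only the knowledge-base definitions matching the given terms."""
    context = "\n".join(
        f"- {t}: {KNOWLEDGE_BASE[t]}" for t in terms if t in KNOWLEDGE_BASE
    )
    return PROMPT_TEMPLATE.format(context=context, question=question)

prompt = build_prompt("How did churn_rate trend last quarter?", ["churn_rate"])
```

Selecting context per question, rather than dumping the whole knowledge base into every prompt, is the design choice that keeps GPT responses accurate and within token budgets.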
- Experience Level: 2+ years
- Strong skills in modern web development with TypeScript and Angular, using technologies such as NgRx, RxJS, modern CSS frameworks, and webpack.
- Some experience with RESTful APIs and basic HTTP knowledge (GET/POST, etc.).
- Experience of the end-to-end software development life cycle and best-practice methodologies.
- Good communication skills.
Immediate Joiners Preferred. Notice Period - Immediate to 30 Days
Interested candidates are requested to email their resumes with the subject line "Application for [Job Title]".
Only applications received via email will be reviewed. Applications through other channels will not be considered.
About Us
adesso India is a dynamic and innovative IT Services and Consulting company based in Kochi. We are committed to delivering cutting-edge solutions that make a meaningful impact on our clients. As we continue to expand our development team, we are seeking a talented and motivated Backend Developer to join us in creating scalable and high-performance backend systems.
Job Description
We are looking for an experienced Backend and Data Developer with expertise in Java, SQL, and BigQuery development on public clouds, mainly GCP. As a Senior Data Developer, you will play a vital role in designing, building, and maintaining robust systems to support our data analytics. This position offers the opportunity to work on complex services, collaborating closely with cross-functional teams to drive successful project delivery.
Responsibilities
- Development and maintenance of data pipelines and automation scripts with Python
- Creation of data queries and optimization of database processes with SQL
- Use of bash scripts for system administration, automation and deployment processes
Database and cloud technologies:
- Managing, optimizing and querying large amounts of data in an Exasol database (prospectively Snowflake)
- Google Cloud Platform (GCP): Operation and scaling of cloud-based BI solutions, in particular
- Composer (Airflow): Orchestration of data pipelines for ETL processes
- Cloud Functions: Development of serverless functions for data processing and automation
- Cloud Scheduler: Planning and automation of recurring cloud jobs
- Cloud Secret Manager: Secure storage and management of sensitive access data and API keys
- BigQuery: Processing, analyzing and querying large amounts of data in the cloud
- Cloud Storage: Storage and management of structured and unstructured data
- Cloud monitoring: monitoring the performance and stability of cloud-based applications
Data visualization and reporting:
- Creation of interactive dashboards and reports for the analysis and visualization of business data with Power BI
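The first two responsibilities above — Python data pipelines plus SQL query work — can be sketched as a single pipeline step: load raw rows, validate them, and aggregate with SQL. SQLite stands in here for the actual warehouse (Exasol/BigQuery), and the table and column names are illustrative assumptions.

```python
# Minimal sketch of a Python data-pipeline step: filter out bad rows,
# load the clean ones, and aggregate with SQL. SQLite is a local
# stand-in for the warehouse; schema and data are illustrative.

import sqlite3

raw_rows = [
    ("2024-01-01", "EMEA", 120.0),
    ("2024-01-01", "EMEA", None),   # bad row: missing amount, filtered out
    ("2024-01-02", "APAC", 80.0),
]

def run_pipeline(rows):
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE sales (day TEXT, region TEXT, amount REAL)")
    clean = [r for r in rows if r[2] is not None]          # validate/transform
    con.executemany("INSERT INTO sales VALUES (?, ?, ?)", clean)
    cur = con.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
    )
    return cur.fetchall()

totals = run_pipeline(raw_rows)
```

In production this step would run as a task inside a Composer (Airflow) DAG, with Cloud Scheduler triggering recurring runs and Cloud Secret Manager supplying the warehouse credentials.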
Requirements
- Minimum of 4-6 years of experience in backend development, with strong expertise in BigQuery, Python, and MongoDB or SQL.
- Strong knowledge of database design, querying, and optimization with SQL and MongoDB, and of designing ETL and orchestrating data pipelines.
- Minimum of 2 years of experience with at least one hyperscaler, ideally GCP, combined with cloud storage technologies, cloud monitoring, and cloud secret management.
- Excellent communication skills to effectively collaborate with team members and stakeholders.
Nice-to-Have:
- Knowledge of agile methodologies and working in cross-functional, collaborative teams.
Objectives of this Role
- Drive the product and business-planning process across cross-functional teams of the company
- Analyze consumer needs, current market trends, and potential partnerships from an ROI and build vs. buy perspective
- Assess current competitor offerings, seeking opportunities for differentiation
- Analyze product requirements and develop appropriate programs to ensure they’re successfully achieved
- Develop, implement, and maintain production timelines across multiple departments
- Appraise new product ideas and strategize appropriate to-market plans
Daily and Monthly Responsibilities
- Drive the execution of all product lifecycle processes for products, including product research, market research, competitive analysis, planning, positioning, roadmap development, requirements development, and product launch
- Translate product strategy into detailed requirements for prototype construction and final product development by engineering teams
- Create product strategy documents that describe business cases, high-level use cases, technical requirements, revenue, and ROI
- Analyze market data to develop sales strategies, and define product objectives for effective marketing communications plans
- Collaborate closely with engineering, production, marketing, and sales teams on the development, QA, and release of products and balance of resources to ensure success for the entire organization
- Develop product positioning and messaging that differentiates TrustCheckr and its features across primary market segments
Skills and Qualifications
- Bachelor’s degree in product design or engineering, or an MBA
- Strong experience in a SaaS/Identity Space/Fintech is a plus
- 1 to 2 years of experience
JOB Requirements and Responsibilities:
#SeniorSystemadministrator
- #ActiveDirectory Domain, #GroupPolicies, #Domaincontroller migration and upgrades.
- File and Print sharing, #NTFS permissions. #FileServer #migrations.
- #MicrosoftExchange or #Office365 messaging, #Outlook Configurations.
- Knowledge of data #Backups and backup strategies; experience with #backuptools will be an additional advantage.
- Basic knowledge of #Routers, #Firewalls, NAT, and #VPN configuration; #SonicWall preferred.
- Knowledge and working experience on #TicketingSystems & #RemoteAdministration tools.
- Good #DesktopTroubleshooting experience.
- #AntiVirus installations and #Troubleshooting.
- Knowledge of #DHCP and #DNS management.
- Ticketing and #RMM tools such as #Labtech, #Kaseya, #Autotask (experience preferred)
Hi All,
5-12 years of experience designing and developing web applications and REST APIs (using MEAN/Node.js or similar stacks). Specifically:
- Prior experience designing/developing Single Page WebApps and REST APIs from prototypes
- Ability to design APIs as per REST standards, along with a good understanding of API security (auth headers, etc.) and the HTTP protocol
- Experience leading 1-2 engineers (mainly interns), guiding them to complete web apps.
- Demonstrates curiosity and strong ownership of deliverables.
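The API-security point above — checking auth headers before serving a resource — can be sketched framework-independently as a handler function. The token set, endpoint, and response shape are hypothetical; a real MEAN/Node.js stack would do this in Express middleware, but the logic is the same.

```python
# Minimal sketch of REST-style auth-header validation: reject requests
# without a valid Bearer token before touching the resource.
# Tokens, endpoint, and payload are illustrative placeholders.

VALID_TOKENS = {"secret-token-123"}

def handle_get_users(headers: dict) -> tuple[int, dict]:
    """Return (status_code, body) for a GET /api/users request."""
    auth = headers.get("Authorization", "")
    if not auth.startswith("Bearer ") or auth.removeprefix("Bearer ") not in VALID_TOKENS:
        return 401, {"error": "unauthorized"}
    return 200, {"users": [{"id": 1, "name": "alice"}]}

status_ok, _ = handle_get_users({"Authorization": "Bearer secret-token-123"})
status_bad, _ = handle_get_users({})
```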
Work Location: Bangalore
Mode: Work from Office
www.goneutrinos.com
Regards
Team-HR

- Extensive experience in building REST APIs
- Experience in building, managing, and enhancing backend/server-side development using Java/JavaScript; expertise in Node.js is preferred.
- Must have built systems that consume and process large volumes of data from various sources; experience handling and managing data in NoSQL databases such as MongoDB.
- Extensive experience building custom web UIs with popular frameworks like Angular and React; experience developing mobile UIs is nice to have.
- Experience working with code repositories such as GitHub and GitLab.
- Fluent written and spoken English.
- Experience working in cloud environments; AWS experience is preferred.
- Experience with Docker and Kubernetes for deployments.
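The "consume and process data from various sources" requirement above amounts to merging documents by key, which NoSQL stores like MongoDB handle with upserts. Here is a sketch of that semantics using an in-memory dict; the document fields and sources are illustrative, and a real system would call MongoDB's update-with-upsert instead.

```python
# Sketch of NoSQL-style document handling: upsert JSON-like documents
# from multiple sources into an in-memory store keyed by _id, mimicking
# MongoDB's update-with-upsert semantics. Purely illustrative.

store: dict[str, dict] = {}

def upsert(doc: dict) -> None:
    """Insert the document, or merge its fields into an existing one."""
    existing = store.setdefault(doc["_id"], {})
    existing.update(doc)

sources = (
    [{"_id": "u1", "name": "alice"}],                              # source A
    [{"_id": "u1", "plan": "pro"}, {"_id": "u2", "name": "bob"}],  # source B
)
for source in sources:
    for doc in source:
        upsert(doc)
```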
1. Developing and maintaining all server-side network components
2. Ensuring optimal performance of the central database and responsiveness to front-end requests
3. Collaborating with front-end developers on the integration of elements
4. Designing customer-facing UI and back-end services for various business processes
5. Implementing effective security protocols, data protection measures, and storage solutions
6. Recording and implementing improvements to processes and technologies







