Responsible for supporting the Cherokee Nation Businesses (CNB) private AI environment, including onboarding AI modules, configuring and maintaining large language models, managing user access, monitoring AI infrastructure resources, and supporting the overall user AI experience. Serve as a technical and analytical resource for the implementation, adoption, governance, and operational support of AI technologies across the organization. Work closely with IT, cybersecurity, data, business units, and leadership to ensure AI services are delivered in a secure, reliable, and user-focused manner. Monitor the evolving AI technology landscape, provide leadership updates, and develop usage reporting to support decision-making, adoption, performance, and risk oversight.
Supports the administration, configuration, and ongoing management of the organization’s private AI environment.
Onboards approved AI modules, tools, models, and capabilities in alignment with organizational standards, security requirements, and governance processes.
Configures, tests, and maintains large language models within the private AI environment.
Supports the integration and ongoing management of HPE Private AI hardware, Azure AI Foundry, and the organization’s private AI environment to enable secure model deployment, workload management, resource monitoring, and enterprise AI service delivery.
Assists with evaluating, deploying, and updating LLMs as new versions, patches, enhancements, and approved capabilities become available.
Manages user access by assigning users to appropriate AI groups, roles, permissions, and approved usage categories.
Monitors AI infrastructure resources, including HPE AI hardware, compute, storage, GPU utilization, system performance, capacity, and configuration health.
Coordinates with infrastructure, security, and vendor support teams to troubleshoot platform, hardware, model, or performance issues.
Provides technical and functional support for AI users, including issue diagnosis, problem resolution, user guidance, and escalation when needed.
Provides training to users on the AI user interface, approved use cases, prompt interaction, responsible AI use, and effective adoption of AI capabilities.
Improves the user AI experience by identifying usability issues, adoption barriers, training needs, and opportunities to enhance AI service delivery.
Develops and maintains AI usage reports, dashboards, and metrics related to adoption, usage trends, model activity, system utilization, and business unit engagement.
Tracks and reports AI platform performance, resource consumption, availability, and capacity planning needs.
Stays current on the AI technology landscape, including private AI platforms, LLM capabilities, AI security risks, emerging tools, regulatory considerations, and industry trends.
Provides periodic leadership updates on AI trends, platform performance, usage, risks, opportunities, and recommendations.
Supports AI governance activities, including documentation, access control reviews, approved use case tracking, and compliance with internal AI policies.
Works with business units to understand AI needs, evaluate potential use cases, and support implementation of approved AI solutions.
Assists in developing AI standards, procedures, job aids, training materials, and user guidance documentation.
Supports testing, validation, and quality assurance of AI modules, model responses, system changes, and user-facing AI functionality.
Identifies risks related to AI usage, data handling, model behavior, access control, and system performance.
Collaborates with cybersecurity to ensure AI systems are configured, monitored, and operated in accordance with security, data protection, and governance requirements.
Maintains documentation related to AI configurations, model versions, user groups, system changes, support processes, and operational procedures.
No supervisory/management authority.
Work is primarily performed in a climate-controlled office setting.
Work requires light physical effort.
Work requires handling of average-weight objects up to 10 lbs with some standing or walking.
Work requires mental and visual effort due to the nature of the job.
Performs other job-related duties as assigned.
Bachelor’s degree in computer science, information systems, data analytics, artificial intelligence, cybersecurity, engineering, or other related technical field, and three or more (3+) years of related technical experience, or an equivalent combination of education and experience.
Experience supporting enterprise AI platforms, private AI environments, cloud AI services, machine learning platforms, or large language model operations.
Working knowledge of HPE AI hardware and infrastructure, including GPU-enabled compute, servers, storage, networking, resource allocation, performance monitoring, system health, and capacity management preferred.
Experience or familiarity with HPE Private Cloud AI, HPE AI infrastructure, NVIDIA GPU-based platforms, or similar enterprise AI hardware environments preferred.
Ability to monitor and support AI hardware resource utilization, including GPU usage, computing capacity, memory, storage consumption, model performance, system availability, and infrastructure scaling needs.
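The kind of resource monitoring described above can be sketched in a few lines of Python. This is a minimal illustration only: it assumes GPU telemetry is available in the CSV format produced by `nvidia-smi --query-gpu=... --format=csv,noheader,nounits`, and the sample data and alert thresholds below are fabricated for the example.

```python
import csv
import io

# Fabricated sample of nvidia-smi CSV output (index, utilization %,
# memory used MiB, memory total MiB); in practice this would come from
# running nvidia-smi on the AI hosts via subprocess or an agent.
SAMPLE = """0, 87, 72341, 81920
1, 12, 10240, 81920"""

def parse_gpu_stats(text):
    """Parse rows of: index, utilization %, memory used MiB, memory total MiB."""
    stats = []
    for row in csv.reader(io.StringIO(text)):
        idx, util, mem_used, mem_total = (int(v.strip()) for v in row)
        stats.append({
            "gpu": idx,
            "util_pct": util,
            "mem_pct": round(100 * mem_used / mem_total, 1),
        })
    return stats

def over_threshold(stats, util_limit=80, mem_limit=90):
    """Flag GPUs whose utilization or memory pressure exceeds alert thresholds."""
    return [s for s in stats if s["util_pct"] > util_limit or s["mem_pct"] > mem_limit]

stats = parse_gpu_stats(SAMPLE)
print(over_threshold(stats))  # GPU 0 exceeds the 80% utilization threshold
```

A production version would feed these readings into the organization’s monitoring and capacity-planning tooling rather than printing them.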
Working knowledge of Microsoft Azure AI Foundry, including AI model deployment, model catalog evaluation, prompt flow/orchestration, AI application development, model testing, responsible AI controls, and integration with Microsoft cloud services preferred.
Experience configuring, deploying, testing, and maintaining large language models within private AI environments and/or Azure AI Foundry.
Familiarity with Azure AI services, Azure Machine Learning, Azure OpenAI, model endpoints, API integrations, identity/access controls, logging, monitoring, and cloud-based AI governance capabilities.
Ability to support hybrid AI architecture models, including private AI environments integrated with approved external or cloud-based AI services such as Microsoft Azure AI Foundry.
Understanding of LLM lifecycle management, including model onboarding, version updates, testing, validation, performance review, user acceptance, documentation, and retirement of outdated models.
Experience managing AI user access, groups, permissions, role-based access, and approved usage categories across private AI and cloud AI environments.
Understanding of AI security, data protection, access control, responsible AI use, model risk, prompt/data leakage concerns, and governance requirements for enterprise AI environments.
Experience developing AI usage reports, dashboards, operational metrics, and leadership reporting related to adoption, system utilization, model activity, business unit usage, and resource consumption.
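As a minimal sketch of the adoption reporting described above, the snippet below aggregates prompt volume by business unit. The log records are fabricated for illustration; real data would come from platform audit logs or a metrics export.

```python
from collections import Counter
from datetime import date

# Fabricated usage-log records for illustration only.
usage_log = [
    {"unit": "Finance", "model": "llm-a", "prompts": 40, "day": date(2024, 5, 1)},
    {"unit": "Finance", "model": "llm-b", "prompts": 10, "day": date(2024, 5, 1)},
    {"unit": "HR",      "model": "llm-a", "prompts": 25, "day": date(2024, 5, 2)},
]

def prompts_by_unit(records):
    """Total prompt volume per business unit -- a basic adoption metric."""
    totals = Counter()
    for r in records:
        totals[r["unit"]] += r["prompts"]
    return dict(totals)

print(prompts_by_unit(usage_log))  # {'Finance': 50, 'HR': 25}
```

The same pattern extends to per-model activity, daily trends, and resource-consumption rollups for leadership dashboards.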
Strong analytical skills with the ability to evaluate AI usage trends, model performance, infrastructure utilization, user adoption, and opportunities for improvement.
Ability to translate technical AI concepts, HPE infrastructure considerations, and Azure AI Foundry capabilities into clear business language for users, management, and executive leadership.
Experience providing user training, technical documentation, support procedures, job aids, and adoption guidance for enterprise AI platforms.
Strong troubleshooting skills with the ability to diagnose AI platform, model, infrastructure, integration, performance, and user experience issues.
Experience with scripting, APIs, automation, SQL, Python, PowerShell, JSON, REST services, or similar technical tools preferred.
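To illustrate the scripting and REST-service skills listed above, the sketch below assembles a JSON request body for a chat-completion-style model endpoint. The endpoint URL and payload schema are hypothetical, modeled loosely on common inference APIs; the real endpoint, headers, and schema depend on the deployed model service.

```python
import json

# Hypothetical private-endpoint URL for illustration only.
ENDPOINT = "https://ai.example.internal/v1/chat/completions"

def build_request(model, prompt, max_tokens=256):
    """Assemble the JSON body for a model-inference call."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    })

body = build_request("llm-a", "Summarize last week's GPU utilization report.")
print(body)
# In practice this body would be POSTed to ENDPOINT with urllib.request or
# requests, using an identity token issued through the platform's access controls.
```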
Familiarity with retrieval-augmented generation, vector databases, embeddings, enterprise knowledge integration, prompt engineering, and AI workflow automation preferred.
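The retrieval step in retrieval-augmented generation can be sketched with a toy example. Real systems use learned embedding models and a vector database, but the ranking logic is the same: embed the query, compare it to document embeddings, and take the closest matches. The bag-of-words "embedding" and sample documents below are stand-ins for illustration.

```python
import math
from collections import Counter

def embed(text):
    """Bag-of-words 'embedding' (stand-in for a real embedding model)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

# Fabricated knowledge-base snippets for illustration.
docs = [
    "GPU utilization alert thresholds and escalation steps",
    "Onboarding checklist for new AI modules",
    "Quarterly adoption metrics by business unit",
]

def retrieve(query, corpus, k=1):
    """Return the k documents most similar to the query."""
    q = embed(query)
    return sorted(corpus, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

print(retrieve("onboarding a new AI module", docs))
# -> ['Onboarding checklist for new AI modules']
```

In a full RAG pipeline, the retrieved text would then be inserted into the model prompt as grounding context.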
Strong verbal and written communication skills.
Strong organizational and coordination skills.
Self-starter requiring minimal direction.
Ability to work collaboratively with IT, cybersecurity, infrastructure, data, business units, vendors, Microsoft, HPE, and executive leadership.
Excellent analytical skills with a proven approach to troubleshooting and problem-solving.