The integration of AI such as Large Language Models (LLMs) like ChatGPT into DevOps marks a significant shift towards more intelligent and efficient practices. However, RackN notes that while these models can generate basic scripts, such as a Terraform file, they often lack context and make assumptions that may not align with specific operational needs. LLMs offer the appearance of expertise without deep understanding: they don't know your policies, preferred regions, naming conventions, or security requirements. This gap can lead to generated scripts that are superficially correct but practically problematic.

The RackN AI Approach

We know the importance of not just increasing automation but enhancing its quality. We make automation more reusable, composable, and community-driven, avoiding the pitfalls of over-automation without governance. By integrating LLMs into the Digital Rebar platform, we aim to create more durable and efficient automation. Here are some guidelines RackN employees have found to get the most out of AI.

Practical Applications and Tips

    1. Script Writing and Refactoring: LLMs can assist in writing and refactoring scripts, translating formats, and summarizing work. They can boost productivity by offering alternatives and planning operations.
    2. Code Review and Recommendations: The AI can review code, suggest improvements, and even make updates, acting like a junior engineer offering fast, albeit not always perfect, solutions.
    3. Replatforming Capabilities: These models can help convert scripts from one format to another, like transforming Terraform into Bash, thus aiding in reinterpreting and understanding existing scripts.
    4. Pre-Defining the Operational Environment: Pre-explain your operational environment in detail when interacting with LLMs. This provides the AI with a clearer picture of your specific needs and constraints, allowing it to tailor its knowledge and suggestions more accurately. Without this, important factors will be decided arbitrarily by the LLM, producing substandard output.
    5. Integrating Personalized Data (Vectorization): In the future, tools that can scan and incorporate your specific code base and operational data into AI inputs will be crucial. RackN is working on integrating this approach to help users find and leverage existing automation more effectively.
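Tip 4 above can be made concrete by assembling your constraints into a reusable prompt preamble rather than retyping them each session. The sketch below is a minimal illustration; the field names and values are invented examples, not RackN conventions.

```python
# Sketch: pre-defining the operational environment as a reusable prompt preamble.
# All keys and values below are hypothetical examples, not real policies.

def build_context_preamble(env: dict) -> str:
    """Render operational constraints into a prompt prefix so the LLM
    does not have to guess regions, naming, or security policy."""
    lines = ["Operational context (follow strictly):"]
    for key, value in env.items():
        lines.append(f"- {key}: {value}")
    return "\n".join(lines)

environment = {
    "cloud regions": "us-east-1 only",
    "naming convention": "team-service-env (e.g. infra-dns-prod)",
    "security policy": "no public ingress; secrets via vault references",
}

# Prepend the preamble to every task so answers respect your constraints.
prompt = build_context_preamble(environment) + "\n\nTask: write a script that provisions a DNS VM."
```

Keeping the preamble in version control alongside your automation lets the whole team share one consistent description of the environment.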
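The vectorization idea in tip 5 boils down to embedding your existing automation and ranking it by similarity to a query. RackN's tooling for this is not described here, so the sketch below substitutes a toy bag-of-words embedding with cosine similarity over an invented snippet catalog, purely to show the retrieval shape.

```python
# Sketch: finding existing automation relevant to a query by vector similarity.
# A toy bag-of-words "embedding" stands in for a real embedding model; the
# catalog of snippets is invented for illustration.
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Token counts act as a crude sparse vector.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

catalog = {
    "provision_vm.sh": "provision virtual machine with cloud-init and network config",
    "rotate_certs.sh": "rotate tls certificates and restart services",
    "dns_update.sh": "update dns records for new virtual machine hosts",
}

def find_similar(query: str, top_n: int = 2) -> list:
    """Rank catalog entries by similarity to the query text."""
    q = embed(query)
    ranked = sorted(catalog, key=lambda name: cosine(q, embed(catalog[name])), reverse=True)
    return ranked[:top_n]
```

In practice a real embedding model and vector store replace `embed` and the dictionary scan, but the retrieve-then-reuse flow is the same.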

Cautions and Governance with AI in DevOps

Despite these advantages, we caution against blindly trusting AI-generated code. It’s essential to maintain governance, treat the code as an untrusted source, and avoid overburdening human review systems. The key is to use AI as a first pass, improving and refining its outputs rather than replacing human oversight.
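One way to treat AI output as an untrusted source without overburdening reviewers is an automated first-pass gate that rejects obvious policy violations before a human ever looks at the script. The checks below are illustrative examples, not a RackN policy set.

```python
# Sketch: a first-pass policy gate for AI-generated scripts, run before any
# human review. The specific rules are invented examples.
import re

POLICY_CHECKS = [
    ("hardcoded credential", re.compile(r"(password|secret|api_key)\s*=\s*['\"]")),
    ("curl piped to shell", re.compile(r"curl[^\n|]*\|\s*(ba)?sh")),
    ("overly permissive chmod", re.compile(r"chmod\s+777")),
]

def review_generated_script(script: str) -> list:
    """Return a list of policy violations; an empty list means the script
    is ready to move on to human review."""
    return [name for name, pattern in POLICY_CHECKS if pattern.search(script)]

findings = review_generated_script("api_key = 'abc123'\nchmod 777 /tmp/run.sh\n")
```

Wiring such a gate into CI keeps governance cheap: humans only review scripts that have already cleared the automated baseline.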
While LLMs like ChatGPT bring notable improvements, they require careful handling, thoughtful review, and continuous training. These tools are powerful, but their effectiveness depends on guided usage and integration into existing systems. Digital Rebar's infrastructure management and automation capabilities help you navigate and implement your infrastructure better, whether you use AI or not. As the technology evolves, leaders must stay proactive in guiding their teams to use these tools responsibly, ensuring productivity gains without adding operational burden.

Interested in trying out Digital Rebar for yourself? Sign up for a free trial.


Date: November 14, 2023
