Analyzing the Flow of Developer Prompts in ChatGPT Conversations

12 Nov 2025

How developers interact with ChatGPT across multiple turns—analyzing prompts, feedback, and flow patterns from 645 developer conversations.

What Do Developers Ask ChatGPT the Most?

12 Nov 2025

Discover what 580 GitHub conversations reveal about how developers use ChatGPT — from code generation to debugging and documentation.

Building the DevGPT Dataset for Developer–ChatGPT Studies

12 Nov 2025

How researchers collected and cleaned 17K developer–ChatGPT conversations from GitHub to explore AI’s role in software development.

Lessons on Developer–AI Collaboration From 580 GitHub Conversations

12 Nov 2025

Developers are using ChatGPT to code, debug, and collaborate. A new study reveals how shared AI chats are reshaping teamwork on GitHub.

Comparing Efficiency Strategies for LLM Deployment and Summarizing PowerInfer‑2’s Impact

3 Nov 2025

This article situates PowerInfer‑2 among other frameworks that improve LLM efficiency through compression, pruning, and speculative decoding.

Performance Evaluation of PowerInfer‑2: Offloading, Prefill, and In‑Memory Efficiency

3 Nov 2025

PowerInfer‑2 achieves up to 29× speedups over llama.cpp and 13× over LLMFlash by leveraging neuron‑level pipelines and NPU‑centric prefill optimization.

How PowerInfer‑2 Turns Your Smartphone Into an AI Workstation

3 Nov 2025

The cost model leverages SMT‑based solving (Z3) to achieve optimal decoding speed under CPU, I/O, and memory constraints.

How Hybrid AI Models Balance Memory and Efficiency

28 Oct 2025

SAMBA combines Mamba and attention to deliver effective long-context language modeling with robust performance and exceptional memory recall.

Meet SAMBA: The AI Model That Remembers More and Trains Faster

28 Oct 2025

SAMBA demonstrates how combining recurrence and attention enables faster, longer-context, and more capable AI, with effectively unlimited memory and superior efficiency.