
Living, Learning, Swarming: The New Frontiers of Synthetic Agency in Robotics

[Figure: three icons on a gradient background depicting a cell with a biohazard symbol, a brain with neural connections, and drones over fields.]

Abstract

The field of robotics is currently navigating a pivotal "Cambrian Explosion," transitioning from the rigid, deterministic automation of the 20th century to a new era of fluid, adaptive, and organic systems. This report provides an exhaustive translational research review of three convergent frontiers: Programmable Living Organisms (Biobots), Foundation Model-Driven Embodied AI, and Decentralized Swarm Intelligence. We analyze the mechanisms of kinematic self-replication in Xenobots and the self-assembly of Anthrobots, detailing their emergent capabilities in regenerative medicine. Simultaneously, we explore the integration of Vision-Language-Action (VLA) architectures, such as RT-2 and DrEureka, which endow robots with semantic reasoning and zero-shot generalization. Finally, we investigate the translation of biological swarm principles into decentralized robotic fleets, focusing on applications in precision agriculture and environmental remediation. This review synthesizes advances from 2024 and 2025 to delineate the trajectory of these technologies from academic curiosities to imminent consumer products and industrial solutions.

1. Introduction: The New Morphology of Intelligence

For the better part of a century, the definition of a robot was constrained by the materials of the industrial age: aluminum, steel, plastic, and silicon. These machines were triumphs of precision engineering, designed to execute repetitive tasks with sub-millimeter accuracy within the structured, predictable environments of automotive assembly lines and silicon fabs. However, the very rigidity that made them successful in factories rendered them brittle in the chaotic, unstructured world of nature and human habitation. They lacked the resilience of biological systems, the adaptability of general intelligence, and the robustness of collective organization.

Today, we are witnessing a fundamental dissolution of the boundary between the machine and the organism. This transformation is driven by a convergence of synthetic biology, large-scale neural computing, and complexity science. The definition of "robot" is expanding to include entities made of living tissue, machines that "think" using internet-scale linguistic models, and swarms that act as fluid, programmable matter.

This report explores these three domains—Biological, Cognitive, and Collective—analyzing their underlying mechanisms, recent breakthroughs, and the commercial pathways currently bringing them to market. By examining the shift from biomimicry (copying nature) to bio-integration (building with nature), and from centralized control to emergent autonomy, we reveal a future where technology is no longer distinct from the environment it inhabits.

2. Biological Robotics: The Rise of Synthetic Morphology

The emergence of "Biobots"—programmable organisms built from living cells—represents perhaps the most radical departure from traditional engineering. Unlike conventional robots, which require energy-intensive manufacturing and persist as electronic waste, biobots are grown, self-powered by metabolic processes, and fully biodegradable.

2.1 Xenobots: Engineering with Amphibian Cells

Xenobots, first unveiled in 2020 and significantly advanced through 2024 and 2025, represent the first class of living, programmable robots. Derived from the stem cells of the African clawed frog (Xenopus laevis), these organisms are not genetically modified in the traditional sense. Instead, their cells are physically rearranged into new configurations to serve new functions, effectively rebooting their multicellularity outside the context of the frog embryo.1

2.1.1 Evolutionary Design and In-Silico Fabrication

The creation of a Xenobot begins not in a wet lab, but in a supercomputer. The complexity of biological interactions makes it impossible for human engineers to intuit the optimal body shape for a specific task. To overcome this, researchers employ Evolutionary Algorithms (EAs).

Running on physics engines like VoxCAD, these algorithms explore a vast "morphospace"—the landscape of all possible body shapes and material distributions.3 The simulation models the interactions of thousands of elastic voxels, representing passive skin cells and active, contractile heart muscle cells. The EA generates random designs, tests them in a virtual environment for a specific behavior (e.g., locomotion speed or payload transport), and iteratively selects and mutates the fittest designs over hundreds of generations.5
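The select-and-mutate loop described above can be sketched in a few lines. This is a toy illustration, not the VoxCAD pipeline: the `fitness` function below merely stands in for a full soft-body physics rollout, and the population size, mutation rate, and voxel grid are invented for clarity.

```python
import random

random.seed(0)  # reproducible toy run

# A candidate body is a flattened 4x4x4 grid of voxels:
# 0 = empty, 1 = passive skin cell, 2 = contractile muscle cell.
N_VOXELS = 64

def random_body():
    return [random.choice([0, 1, 2]) for _ in range(N_VOXELS)]

def fitness(body):
    # Toy stand-in for the physics simulation: VoxCAD would score locomotion
    # speed; here we simply reward a balanced mix of muscle and passive tissue.
    return min(body.count(1), body.count(2))

def mutate(body, rate=0.05):
    return [random.choice([0, 1, 2]) if random.random() < rate else v
            for v in body]

def evolve(generations=50, pop_size=20, elite=5):
    population = [random_body() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[:elite]                   # keep the fittest designs
        children = [mutate(random.choice(parents))
                    for _ in range(pop_size - elite)]  # vary the rest
        population = parents + children
    return max(population, key=fitness)

best = evolve()
```

Because the fittest designs are carried over unchanged (elitism), the best score can only improve from generation to generation; the real pipeline replaces `fitness` with hundreds of simulated locomotion trials per candidate.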

Once the AI identifies a winning design, the "sim-to-real" transfer occurs. In early iterations, this required microsurgery: scientists used tiny forceps and cautery electrodes to physically carve dissociated frog blastula cells into the computer-specified shapes.7 These cells, liberated from the constraints of the developing embryo, naturally adhere to one another and reorganize, creating a cohesive, living organism less than a millimeter wide.1

2.1.2 Behavioral Plasticity: Cilia and Memory

A key advancement in recent generations of Xenobots is the shift from muscle-driven movement to cilia-driven propulsion. Cilia are hair-like organelles that, in the frog, normally sweep mucus across epithelial surfaces. Xenobots repurpose these structures as microscopic oars, allowing them to swim rapidly through fluid environments.8 This demonstrates a profound "behavioral plasticity"—the ability of cells to execute functions they did not explicitly evolve to perform when placed in a new morphological context.

Furthermore, these organisms have demonstrated rudimentary memory. Researchers have engineered Xenobots that change color when exposed to specific optical stimuli. This proof of principle suggests that future biobots could serve as living sensors, swimming through environments to detect radioactive contamination, chemical pollutants, or disease markers, and recording this exposure history in their cellular structure for later retrieval.8

2.2 Kinematic Self-Replication: A Mechanical Reproduction

Perhaps the most profound discovery in the field of Xenobots is the observation of kinematic self-replication, a mode of reproduction fundamentally different from the biological standard of mitosis or meiosis.

2.2.1 The "Pac-Man" Mechanism

In standard biological reproduction, an organism grows and splits, or produces gametes containing genetic instructions. Xenobots, however, replicate mechanically. When placed in a petri dish containing loose stem cells (feedstock), moving Xenobots act as bulldozers. They sweep the loose cells into piles. If these piles are compressed sufficiently by the movement of the parents, the cells adhere to one another, mature, and eventually develop into a new, mobile Xenobot.9

This process is termed "kinematic" because it relies entirely on the motion (kinematics) of the parent to assemble the offspring.11 To optimize this, the evolutionary algorithm was tasked with finding a shape that maximized replication efficiency. The AI discovered that a C-shape—strikingly resembling the video game character "Pac-Man"—was the most efficient geometry. The "mouth" of the Pac-Man shape acts as a shovel, effectively trapping and compacting loose cells into spherical offspring.12 This AI-designed morphology allowed the system to sustain replication over multiple generations, a feat that spherical Xenobots could not achieve.12
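The generational dynamic above can be captured in a back-of-the-envelope toy model. The parameters (`sweep_rate`, `pile_threshold`, the feedstock count) are invented for illustration and are not calibrated to the published experiments; the point is only that the population grows as long as feedstock remains and parents sweep efficiently.

```python
# Toy model of kinematic self-replication: mobile parents sweep loose feedstock
# cells into piles, and any pile that reaches a threshold size matures into a
# new, mobile parent. All parameters are illustrative.
def simulate(parents=10, feedstock=5000, sweep_rate=12, pile_threshold=50,
             rounds=5):
    history = [parents]
    for _ in range(rounds):
        gathered = min(feedstock, parents * sweep_rate)  # cells swept this round
        feedstock -= gathered
        parents += gathered // pile_threshold            # piles that mature
        history.append(parents)
    return history

history = simulate()  # population per generation: [10, 12, 14, 17, 21, 26]
```

The C-shaped morphology enters this picture through `sweep_rate`: a better scoop gathers more cells per pass, which is precisely the quantity the evolutionary algorithm was asked to maximize.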

2.2.2 Implications for Synthetic Biology

This discovery challenges the dogma that replication requires complex genomic machinery evolved over eons. It suggests that replication is a property of the physical arrangement of matter and its interaction with the environment.13 The ability of these clusters to spontaneously replicate provides a glimpse into the origins of multicellular life and offers a potential method to deploy self-expanding swarms for tasks like environmental cleanup, where a small seed population could grow to match the scale of the pollution, provided there is biological feedstock available.6

2.3 Anthrobots: Human-Derived Bio-machines

While Xenobots demonstrated the principle, their reliance on amphibian cells limits their medical utility due to the risk of immune rejection in humans. This limitation was addressed with the development of Anthrobots, unveiled in late 2023 and advanced throughout 2024. Anthrobots are constructed from adult human tracheal epithelial cells, bridging the gap between biological robotics and regenerative medicine.14

2.3.1 Self-Assembly and Scalability

Unlike the early Xenobots, which required laborious manual shaping, Anthrobots possess a remarkable capacity for self-assembly. When cultured under specific conditions in an extracellular matrix, tracheal cells—which naturally possess cilia lining the windpipe—spontaneously form spheroids. By manipulating the chemical environment, researchers induce the cilia to face outward rather than inward. These outward-facing cilia act as oars, driving the Anthrobot through its environment.15

This self-assembly capability makes Anthrobots highly scalable. Millions can be produced in parallel without the bottleneck of manual microsurgery, paving the way for industrial-scale deployment.15 They range in size from the width of a human hair to that of a sharpened pencil tip (30 to 500 micrometers) and can survive for 45 to 60 days before naturally biodegrading.15

2.3.2 Therapeutic Potential: Neural Repair

The most significant translational advance for Anthrobots is their demonstrated ability to facilitate healing in human tissue. Experiments have shown that when a cluster of Anthrobots is placed on a layer of damaged human neurons (a model of neural injury), they bridge the gap and induce the neurons to regrow across the defect.14 The mechanism appears to involve physical scaffolding or signaling that encourages tissue regeneration, a capability that exists without any genetic modification to the cells.18

This opens the door to patient-specific medicine. Theoretically, a patient's own tracheal cells could be harvested, cultured into Anthrobots, and reintroduced into their body to clear arterial plaque, deliver drugs to tumors, or repair nerve damage, with zero risk of immune rejection.17

2.4 Ethical and Safety Profiles

The emergence of living robots raises unique ethical questions regarding the definition of life and the control of biological agents. However, researchers emphasize distinct safety features intrinsic to biobots that distinguish them from potentially dangerous pathogens or invasive species:

  • Biodegradability: Unlike metal components that persist as e-waste or microplastics, biobots decompose naturally into harmless proteins after their energy stores are depleted.1

  • Metabolic Containment: Biobots cannot scavenge food from the wild; they run on pre-loaded embryonic energy stores. Once this energy is depleted, they die. They also cannot reproduce outside of highly specific laboratory conditions containing dissociated stem cells.9

  • Genetic Stability: Currently, these bots are not genetically modified organisms (GMOs). Their novelty lies in their structure (phenotype), not their DNA (genotype), which simplifies some regulatory hurdles.2

3. The Cognitive Revolution: Foundation Models in Robotics

While biobots redefine the hardware of robotics, a parallel revolution is transforming the software of electromechanical robots. The integration of Foundation Models—massive neural networks trained on internet-scale data—is moving robotics from the era of specialized, narrow competence to general-purpose adaptability.

3.1 Vision-Language-Action (VLA) Models

Traditionally, robots were programmed with modular stacks: a perception module identified objects, a planning module calculated trajectories, and a control module fired motors. Foundation models collapse these distinct steps into end-to-end systems known as Vision-Language-Action (VLA) models. These models, such as Google's RT-2 (Robotic Transformer 2), are pretrained on vast corpora of text and images from the web, and then fine-tuned on robotic trajectory data.20

3.1.1 The Tokenization of Agency

The key innovation in models like RT-2 is the treatment of physical actions as another form of language. In these architectures, robotic actions (e.g., "move arm 10cm forward," "close gripper") are discretized into numerical tokens, identical in format to text tokens in a Large Language Model (LLM).22

The model inputs an image of the scene and a natural language command (e.g., "pick up the extinct animal") and outputs a sequence of tokens. Some tokens represent text (reasoning steps), while others represent physical motor commands.22 This architecture allows the robot to inherit the semantic knowledge of the web. When asked to "pick up the extinct animal" from a table containing a plastic dinosaur, a lion, and a sponge, a traditional robot would fail because it doesn't know what "extinct" means. A VLA model knows that dinosaurs are extinct from its web pre-training and can map that semantic concept to the visual representation of the plastic toy and the motor action to grasp it.22
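The discretization step can be sketched minimally as follows, assuming a normalized action range and 256 bins per dimension (both are illustrative choices standing in for the actual RT-2 action vocabulary):

```python
# Sketch of VLA-style action tokenization: each continuous action dimension is
# quantized into one of 256 bins, yielding integer tokens that can share a
# vocabulary with text tokens. Bin count and range are illustrative.
N_BINS = 256
LOW, HIGH = -1.0, 1.0  # normalized action range

def action_to_tokens(action):
    """Map a continuous action vector (e.g., [dx, dy, dz, gripper]) to token IDs."""
    tokens = []
    for a in action:
        a = max(LOW, min(HIGH, a))  # clip to the valid range
        tokens.append(round((a - LOW) / (HIGH - LOW) * (N_BINS - 1)))
    return tokens

def tokens_to_action(tokens):
    """Inverse mapping used at execution time to recover motor commands."""
    return [t / (N_BINS - 1) * (HIGH - LOW) + LOW for t in tokens]

tokens = action_to_tokens([0.1, -0.5, 0.0, 1.0])
recovered = tokens_to_action(tokens)
```

The quantization error is bounded by half a bin width (about 0.004 here), which is why a few hundred bins per dimension suffice for smooth manipulation.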

3.1.2 Chain-of-Thought Reasoning

Beyond simple manipulation, new models are incorporating Chain of Thought (CoT) reasoning. Before executing an action, the robot generates a text plan explaining its logic. For example, if asked to "make me a healthy breakfast," the robot might generate the internal monologue: "I see an apple, a bag of chips, and a soda. The apple is the healthiest option. I should pick up the apple." This intermediate reasoning step, powered by the VLA, significantly improves success rates in complex, multi-step tasks by allowing the robot to "think" before it acts.22
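The plan-then-act pattern can be sketched as below. Since the actual VLA weights are not public, `query_model` is a mocked stand-in returning canned text; the structure of interest is that the action is parsed out only after the textual plan.

```python
# Sketch of chain-of-thought prompting for a robot policy: the model is asked
# to emit a textual plan before its action, and the action is then parsed from
# the response. `query_model` is a mock; a real system would call the VLA.
def query_model(prompt):
    return ("Plan: I see an apple, a bag of chips, and a soda. "
            "The apple is the healthiest option.\n"
            "ACTION: pick_up(apple)")

def policy(scene_caption, instruction):
    prompt = (f"Scene: {scene_caption}\n"
              f"Instruction: {instruction}\n"
              "Think step by step, then end with 'ACTION: <command>'.")
    response = query_model(prompt)
    for line in response.splitlines():
        if line.startswith("ACTION:"):
            return line[len("ACTION:"):].strip()
    return None  # no action emitted: refuse rather than guess

action = policy("apple, chips, soda on a table", "make me a healthy breakfast")
```

Returning `None` when no well-formed action appears is one simple guard against executing free-form model text as a motor command.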

3.2 Bridging the Sim-to-Real Gap

A major bottleneck in training robots via machine learning is the scarcity of real-world data. Physical robots are slow, costly to operate, and prone to breakage. Consequently, much training occurs in physics simulators (e.g., Isaac Sim, MuJoCo). However, policies learned in simulation often fail in the real world due to discrepancies in friction, lighting, and sensor noise—the "Sim-to-Real Gap."

3.2.1 Domain Randomization

To overcome this, researchers employ Domain Randomization. During training, the simulation parameters (friction coefficients, object masses, lighting colors, camera angles) are wildly randomized.24 The neural network learns a policy that is robust enough to succeed across all these variations. When deployed in the real world, the robot treats reality as just another variation of the simulation it has already mastered.
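In code, per-episode randomization amounts to drawing a fresh parameter set before each rollout. The parameter names and ranges below are illustrative, not taken from any particular paper or simulator API.

```python
import random

# Per-episode domain randomization: each training episode draws its physics
# and rendering parameters from wide ranges, so the learned policy cannot
# overfit to any single simulated world.
def sample_domain():
    return {
        "friction":     random.uniform(0.2, 1.5),
        "object_mass":  random.uniform(0.05, 2.0),    # kg
        "light_rgb":    [random.random() for _ in range(3)],
        "camera_pitch": random.uniform(-10.0, 10.0),  # degrees
        "sensor_noise": random.uniform(0.0, 0.05),    # std of additive noise
    }

# One parameter set per episode; the simulator would be reset with each dict.
episodes = [sample_domain() for _ in range(1000)]
```

The ranges are the crucial design choice: too narrow and the policy overfits to one world; too wide and training may fail to converge at all (which is exactly the knob DrEureka hands to an LLM, as described below).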

3.2.2 DrEureka: LLM-Driven Reward Design

A significant recent advancement (2024) is the use of LLMs to automate the design of Reinforcement Learning (RL) reward functions. Writing reward functions (the mathematical rules that tell a robot when it is doing well) is notoriously difficult; poorly specified rewards lead to "reward hacking" where the robot finds lazy ways to score points without doing the task.

DrEureka (an evolution of the Eureka system) uses an LLM to write the reward code for the robot.26 It iteratively writes code, tests it in simulation, reads the performance statistics, and improves the code. This system has taught quadruped robots to balance on yoga balls—a task demanding extreme agility—better than human-designed reward functions could.27 The LLM also configures the Domain Randomization parameters, effectively tuning the rigor of the simulation to ensure the policy transfers to the real world.26
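The iterative loop described above can be sketched conceptually as follows. `query_llm` and `run_simulation` are mocked stand-ins for the language model and the physics simulator; they are not part of any real API, and the fake rollout exists only to make the feedback loop executable.

```python
# Conceptual sketch of LLM-in-the-loop reward design: propose reward code,
# score it in simulation, and feed the statistics back into the next prompt.
def query_llm(prompt):
    # Stand-in: a real system would return LLM-generated Python reward code.
    return "def reward(state): return -abs(state['tilt'])"

def run_simulation(reward_src):
    # Compile the proposed reward and score it on a fake ten-step rollout.
    namespace = {}
    exec(reward_src, namespace)
    reward_fn = namespace["reward"]
    rollout = [{"tilt": t / 10} for t in range(10)]
    return sum(reward_fn(s) for s in rollout) / len(rollout)

def reward_design_loop(iterations=3):
    prompt = "Write a reward function for balancing a quadruped on a ball."
    best_src, best_score = None, float("-inf")
    for _ in range(iterations):
        src = query_llm(prompt)
        score = run_simulation(src)
        if score > best_score:
            best_src, best_score = src, score
        # Feed the measured performance back into the next prompt.
        prompt += f"\nPrevious attempt scored {score:.3f}; improve it."
    return best_src, best_score

best_src, best_score = reward_design_loop()
```

The design insight is that the LLM never touches the robot directly: it only emits code and reads scalar statistics, so every proposal is vetted in simulation before anything reaches hardware.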

3.3 Safety and Physical Hallucinations

As robots become more autonomous, cybersecurity becomes a physical safety issue. Research in 2025 has focused on "Backdoor Attacks" in robot learning models, where adversaries could poison the training data so that a specific visual trigger causes the robot to misbehave.28 Furthermore, the "Hallucination" problem of LLMs—where they confidently state falsehoods—translates to "Physical Hallucinations," where a robot might confidently attempt an impossible physical action, leading to damage. "Safety Instruction" layers are being developed to filter LLM outputs through rigorous safety checks before they are converted into motor commands.26

4. Collective Intelligence: Swarm Robotics

Swarm robotics draws inspiration from social insects—ants, bees, and termites—where complex global behaviors emerge from the interactions of thousands of simple individuals following local rules. This field moves away from the "super-smart robot" paradigm toward the "many-simple-robots" paradigm.

4.1 Principles of Decentralized Autonomy

The core philosophy of swarm robotics is decentralization. There is no "leader" or central computer coordinating the fleet. Instead, agents rely on local sensing and communication.29

  • Stigmergy: Agents communicate indirectly by modifying their environment. Just as termites leave pheromone trails, swarm robots might leave digital markers or physical alterations (e.g., piling bricks) that trigger behaviors in other robots.30

  • Robustness: In a centralized system, if the central computer fails, the system collapses. In a swarm, if 10% of the robots fail, the swarm continues to function, albeit slightly slower. This makes them ideal for hazardous environments.29
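Both principles can be demonstrated in a minimal stigmergy sketch: agents deposit a "digital pheromone" on a shared grid and follow the strongest nearby marker, using only local information. The grid size, agent count, and deposit amounts are arbitrary choices for illustration.

```python
import random

GRID = 10
pheromone = [[0.0] * GRID for _ in range(GRID)]  # the shared environment

def step(agent):
    x, y = agent
    pheromone[y][x] += 1.0  # deposit a marker at the current cell
    neighbors = [(x + dx, y + dy)
                 for dx, dy in [(-1, 0), (1, 0), (0, -1), (0, 1)]
                 if 0 <= x + dx < GRID and 0 <= y + dy < GRID]
    # Local rule: move toward the strongest trail, breaking ties randomly.
    strongest = max(pheromone[ny][nx] for nx, ny in neighbors)
    choices = [(nx, ny) for nx, ny in neighbors
               if pheromone[ny][nx] == strongest]
    return random.choice(choices)

agents = [(random.randrange(GRID), random.randrange(GRID)) for _ in range(20)]
for _ in range(50):
    agents = [step(a) for a in agents]
```

Aggregation emerges even though no agent sees the global state, and removing any subset of agents merely weakens the trail rather than breaking the system, which is the robustness property noted above.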

4.2 Micro-scale Swarms: From Nanoparticles to Sound Control

Recent years have seen miniaturization push swarms into the microscopic realm, aiming for medical and environmental applications.

4.2.1 Acoustic and Magnetic Control

Research in 2025 has highlighted the use of external fields to coordinate micro-swarms. Researchers have developed swarms of paramagnetic nanoparticles that can be reconfigured into "ant bridges" or conductive ribbons using oscillating magnetic fields.32 Furthermore, a breakthrough at Penn State (2025) demonstrated the control of micro-robot swarms using sound waves. These "talking" robots use acoustic signals to coordinate, mimicking the cohesion of bird flocks. This method allows for the organization of millions of microscopic agents to potentially scrub arterial walls or clean microplastics from water.33

4.2.2 Programmable Matter

At the macro-micro interface, researchers at UC Santa Barbara (2025) developed a swarm that behaves like a "programmable material." These small, mobile units can flow like a liquid to navigate tight spaces but then lock together to form solid, weight-bearing structures capable of supporting a human. This "claytronics" approach envisions swarms that can shapeshift into tools—a wrench, a bridge, or a jack—on demand.34

4.3 Macro-scale Application: The Agricultural Revolution

The most immediate commercial impact of swarm robotics is in agriculture. The era of the massive, soil-compacting tractor is giving way to fleets of small, lightweight autonomous machines.

SwarmFarm Robotics is a leading example, utilizing "SwarmBots"—autonomous platforms weighing roughly 2.5 tonnes (compared to 20+ tonnes for conventional tractors).35 These robots operate in fleets, coordinating their paths to spray, weed, or mow. The economic and agronomic benefits are substantial:

  • Soil Health: Lightweight swarms reduce soil compaction, preserving the "living soil" and water retention.

  • Precision: By integrating with weed-detection systems (like "See & Spray"), these swarms can reduce chemical usage by up to 90%, applying herbicide only to individual weeds rather than blanketing the whole field.36

4.4 Educational Platforms

To train the next generation of engineers, several educational platforms have become standard.

  • Thymio II: A small robot used to teach swarm behaviors, featuring infrared proximity sensors for local communication.37

  • Kilobots: Simple vibrating robots sold in packs of hundreds, used to verify emergent properties in massive numbers.39

  • Zooids: Small, wheeled robots used for "Swarm User Interfaces," acting as physical pixels that move on a tabletop to display information.40

5. Translational Research and Consumer Products: From Lab to Market

The "Translational Gap"—the valley of death between academic papers and viable products—is being bridged rapidly in 2025/2026.

5.1 The Humanoid Market: Embodied AI for the Home

The integration of VLA models into humanoid frames has sparked a race for the first viable consumer general-purpose robot.

  • 1X Technologies (NEO): Perhaps the closest to a consumer release, the "NEO" android is a bipedal, lightweight robot designed for the home. It features a soft outer layer ("suit") for safety and relies on teleoperation for edge cases. Pre-orders opened in late 2025 at a price point of ~$20,000 (or a subscription model), with deliveries expected in 2026.42

  • Amazon Astro: Initially a niche product, Astro is receiving updates in 2025 to integrate LLM capabilities. This transforms it from a mobile camera into a proactive home assistant that can reason about security threats or lost items, effectively becoming a "smart pet" that understands context.45

  • Tesla Optimus & Figure AI: Companies like Tesla and Figure are focusing on industrial applications first, utilizing data from their respective domains (autonomous driving and logistics) to train robots that can engage in full speech-to-speech conversation while performing manipulation tasks.47

5.2 Bio-Bot Commercialization

In the biological domain, Fauna Systems (co-founded by the creators of Xenobots, Michael Levin and Josh Bongard) is pioneering the commercialization of living machines. They are developing "Xenobots-as-a-Service," targeting industrial applications like environmental remediation (cleaning waterways) and biomanufacturing. Their pitch involves custom-designing organisms to sense and metabolize specific pollutants. While still in early stages compared to robotic vacuums, they represent the birth of a new industry: Synthetic Morphological Engineering.49

6. Conclusion

We are witnessing the transition of robotics from a discipline of mechanical engineering to one of general science. The robots of the near future will not just be built; they will be grown, they will learn, and they will cooperate. Whether it is a swarm of agricultural bots tending crops with individual care, a humanoid answering the door, or a microscopic biobot repairing a severed nerve, the era of programmable matter has arrived.

Table 1: Comparative Analysis of Emerging Robotic Modalities

Feature | Xenobots / Anthrobots | Foundation Model Robots (e.g., RT-2) | Swarm Robots (e.g., SwarmFarm)
Primary Material | Biological Cells (Frog/Human) | Metal, Plastic, Silicon | Metal, Electronics
Control Mechanism | Morphological Computation / Chemical | Large Neural Networks (VLA) | Decentralized Algorithms / Stigmergy
Power Source | ATP / Embryonic Energy Stores | Batteries / Grid | Diesel / Batteries / Solar
Scalability | Self-replication / Self-assembly | Manufacturing-constrained | Linear (add more units)
Key Capability | Self-healing, Biodegradability | Semantic Reasoning, Generalization | Robustness, Area Coverage
Current Status | Research / Early Translational | Pilot Commercial / Research | Commercial Deployment

Table 2: Key Translational Startups and Products (2024-2026)

Company | Product/Technology | Domain | Commercial Status
Fauna Systems | Xenobots-as-a-Service | Bio-Engineering | Early Stage / Pilot
1X Technologies | NEO | Consumer Humanoid | Pre-orders open (deliveries 2026)
SwarmFarm | SwarmBot 5 | Agriculture | Commercially available
Google DeepMind | RT-2 / Gemini for Robots | AI Software | Research / Internal
Amazon | Astro (LLM Update) | Home Assistant | Commercially available
Carbon Robotics | LaserWeeder | Precision Ag | Commercially available

Works cited

  1. Xenobots - AI for Good - ITU, accessed January 8, 2026, https://aiforgood.itu.int/speaker/xenobots/

  2. Demystifying Xenobots: The Future of Living Medical Technologies - YouTube, accessed January 8, 2026, https://www.youtube.com/watch?v=L9ePEpjdCI4

  3. Supplementary Information for Kinematic self replication in reconfigurable organisms - ScienceOpen, accessed January 8, 2026, https://www.scienceopen.com/document_file/59a226b8-d5dc-4e8b-bf86-97590f92499d/PubMedCentral/59a226b8-d5dc-4e8b-bf86-97590f92499d.pdf

  4. Research on Motion Evolution of Soft Robot Based on VoxCAD | Request PDF, accessed January 8, 2026, https://www.researchgate.net/publication/334855898_Research_on_Motion_Evolution_of_Soft_Robot_Based_on_VoxCAD

  5. Xenobots - Programmable Organisms - IJCRT.org, accessed January 8, 2026, https://ijcrt.org/papers/IJCRT2108490.pdf

  6. (PDF) Kinematic self-replication in reconfigurable organisms - ResearchGate, accessed January 8, 2026, https://www.researchgate.net/publication/356625557_Kinematic_self-replication_in_reconfigurable_organisms

  7. AI Xenobots - Communications of the ACM, accessed January 8, 2026, https://cacm.acm.org/news/ai-xenobots/

  8. Scientists Create the Next Generation of Living Robots - Tufts Now, accessed January 8, 2026, https://now.tufts.edu/2021/03/31/scientists-create-next-generation-living-robots

  9. Living robots made in a lab have found... (NPR News) - Behind the headlines - NCBI, accessed January 8, 2026, https://www.ncbi.nlm.nih.gov/search/research-news/15247/

  10. Self-Replicating Xenobots - NeuroLogica Blog, accessed January 8, 2026, https://theness.com/neurologicablog/self-replicating-xenobots/

  11. What is, (and what isn't) "kinetic replication" as it applies to molecules and to living organisms? - Biology Stack Exchange, accessed January 8, 2026, https://biology.stackexchange.com/questions/105428/what-is-and-what-isnt-kinetic-replication-as-it-applies-to-molecules-and-t

  12. Team builds first living robots—that can reproduce - Wyss Institute, accessed January 8, 2026, https://wyss.harvard.edu/news/team-builds-first-living-robots-that-can-reproduce/

  13. Kinematic self-replication in reconfigurable organisms - PMC - NIH, accessed January 8, 2026, https://pmc.ncbi.nlm.nih.gov/articles/PMC8670470/

  14. With Living Robots, Scientists Unlock Cells' Power to Heal | Tufts Now, accessed January 8, 2026, https://now.tufts.edu/2024/03/22/living-robots-scientists-unlock-cells-power-heal

  15. Scientists build tiny biological robots from human cells - Wyss Institute, accessed January 8, 2026, https://wyss.harvard.edu/news/scientists-build-tiny-biological-robots-from-human-cells/

  16. Meet 'anthrobots,' tiny bio-machines built from human tracheal cells | Popular Science, accessed January 8, 2026, https://www.popsci.com/technology/anthrobot-xenobot-trachea-cell/

  17. Meet Anthrobots, Your Body's New Biological Repair Crew - SynBioBeta, accessed January 8, 2026, https://www.synbiobeta.com/read/meet-anthrobots-your-bodys-new-biological-repair-crew

  18. Scientists Build Tiny Biological Robots from Human Cells - Tufts Now, accessed January 8, 2026, https://now.tufts.edu/2023/11/30/scientists-build-tiny-biological-robots-human-cells

  19. Xenobots and Anthrobots: Biology's Unexpected Architects - Atlantic International University, accessed January 8, 2026, https://www.aiu.edu/innovative/xenobots-and-anthrobots-biologys-unexpected-architects/

  20. kyegomez/RT-2: Democratization of RT-2 "RT-2: New model translates vision and language into action" - GitHub, accessed January 8, 2026, https://github.com/kyegomez/RT-2

  21. RT-2: Vision-Language-Action Models Transfer Web Knowledge to Robotic Control - Proceedings of Machine Learning Research, accessed January 8, 2026, https://proceedings.mlr.press/v229/zitkovich23a/zitkovich23a.pdf

  22. RT-2: Vision-Language-Action Models, accessed January 8, 2026, https://robotics-transformer2.github.io/

  23. RT-2: New model translates vision and language into action - Google DeepMind, accessed January 8, 2026, https://deepmind.google/blog/rt-2-new-model-translates-vision-and-language-into-action/

  24. UNDERSTANDING DOMAIN RANDOMIZATION FOR SIM-TO-REAL TRANSFER, accessed January 8, 2026, https://collaborate.princeton.edu/en/publications/understanding-domain-randomization-for-sim-to-real-transfer/

  25. Sim2Real Transfer Methods - Emergent Mind, accessed January 8, 2026, https://www.emergentmind.com/topics/sim2real-transfer-method

  26. Language Model Guided Sim-To-Real Transfer - DrEureka, accessed January 8, 2026, https://eureka-research.github.io/dr-eureka/

  27. [2406.01967] DrEureka: Language Model Guided Sim-To-Real Transfer - arXiv, accessed January 8, 2026, https://arxiv.org/abs/2406.01967

  28. Trust in LLM-controlled Robotics: a Survey of Security Threats, Defenses and Challenges, accessed January 8, 2026, https://arxiv.org/html/2601.02377v1

  29. Swarm Surge 2025: How AI-Powered Bots Are Redefining Automation - Factorem, accessed January 8, 2026, https://www.factorem.co/knowledge-hub/swarm-surge-2025-how-ai-powered-bots-are-redefining-automation

  30. Towards applied swarm robotics: current limitations and enablers - PMC - PubMed Central, accessed January 8, 2026, https://pmc.ncbi.nlm.nih.gov/articles/PMC12202227/

  31. The Future of Swarm Robotics: Applications and Challenges, accessed January 8, 2026, https://www.automate.org/news/the-future-of-swarm-robotics-applications-and-challenges-123

  32. Magnetic Microrobot Swarms with Polymeric Hands Catching Bacteria and Microplastics in Water | ACS Nano - ACS Publications, accessed January 8, 2026, https://pubs.acs.org/doi/10.1021/acsnano.4c02115

  33. Tiny “talking” robots form shape-shifting swarms that heal themselves | ScienceDaily, accessed January 8, 2026, https://www.sciencedaily.com/releases/2025/08/250812234535.htm

  34. This Robot Swarm Can Flow Like Liquid and Support a Human's Weight - SingularityHub, accessed January 8, 2026, https://singularityhub.com/2025/02/24/this-robot-swarm-can-flow-like-liquid-and-support-a-humans-weight/

  35. SwarmBot 5 | GOFAR - Agricultural Robotics, accessed January 8, 2026, https://www.agricultural-robotics.com/robot/swarmbot-5

  36. SwarmFarm Robotics: Designed by Farmers for Farmers - Morning Ag Clips, accessed January 8, 2026, https://www.morningagclips.com/swarmfarm-robotics-designed-by-farmers-for-farmers/

  37. A Review of Swarm Robotics in a NutShell - MDPI, accessed January 8, 2026, https://www.mdpi.com/2504-446X/7/4/269

  38. Mighty Thymio for University-Level Educational Robotics | Request PDF - ResearchGate, accessed January 8, 2026, https://www.researchgate.net/publication/361526664_Mighty_Thymio_for_University-Level_Educational_Robotics

  39. World's smallest autonomous robots are 'smaller than a grain of salt,' cost one penny apiece — researchers expect new micron-scale fully-programmable robots to be used in medicine, microscale manufacturing, and other areas | Tom's Hardware, accessed January 8, 2026, https://www.tomshardware.com/maker-stem/robot-kits/worlds-smallest-fully-programmable-autonomous-robots-are-smaller-than-a-grain-of-salt-cost-one-penny-apiece-researchers-expect-new-micron-scale-robots-to-be-used-in-medicine-microscale-manufacturing-and-other-areas

  40. Zooids - AVIZ, accessed January 8, 2026, http://www.aviz.fr/swarmui

  41. Zooids: Building Blocks for Swarm User Interfaces - YouTube, accessed January 8, 2026, https://www.youtube.com/watch?v=ZVdAfDMP3m0

  42. Innovative Humanoid Robots in 2025–2026 - Reality or Hype? - Winssolutions, accessed January 8, 2026, https://www.winssolutions.org/humanoid-robots-2025-2026-reality-hype/

  43. NEO humanoid designed for household use, available for preorder - The Robot Report, accessed January 8, 2026, https://www.therobotreport.com/1x-announces-pre-order-launch-neo-humanoid-robot/

  44. NEO Home Robot | Order Today - 1X.tech, accessed January 8, 2026, https://www.1x.tech/discover/neo-home-robot

  45. Astro update email today, Nov 29 2023. New variant of Astro on the way for businesses, and other news : r/AmazonAstro - Reddit, accessed January 8, 2026, https://www.reddit.com/r/AmazonAstro/comments/186yhdb/astro_update_email_today_nov_29_2023_new_variant/

  46. The most advanced robots in 2025 - Standard Bots, accessed January 8, 2026, https://standardbots.com/blog/most-advanced-robot

  47. A Humanoid Robot in Every Home? It's Closer Than You Think w/ Brett Adcock (at A360 2025) | EP #156 - YouTube, accessed January 8, 2026, https://www.youtube.com/watch?v=hHA4-nEBer8

  48. Top 12 Humanoid Robots of 2026, accessed January 8, 2026, https://humanoidroboticstechnology.com/articles/top-12-humanoid-robots-of-2025/

  49. Startup Of The Week: Fauna Systems - The Innovator, accessed January 8, 2026, https://theinnovator.news/startup-of-the-week-fauna-systems/

  50. Fauna Systems v0.3, accessed January 8, 2026, https://www.faunasystems.com/
