Integrating knowledge from various sources is a recurring problem in Artificial Intelligence, often addressed by multi-context systems (MCSs). Existing MCSs, however, have limited support for the open-world semantics of knowledge bases (KBs) expressed in knowledge representation languages based on first-order logic. To address this problem, we present knowledge base networks (KBNs), which consist of open-world KBs linked by non-monotonic bridge rules under a stable model semantics. Basic entailment in KBNs is decidable whenever it is decidable in the individual KBs. In particular, for networks of KBs in well-known Description Logics (DLs), reasoning is reducible to reasoning in the dl-programs of Eiter et al. As a by-product, we obtain an embedding of Motik and Rosati’s hybrid MKNF KBs, which amount to a special case of KBNs, into dl-programs. We also show that reasoning in networks of ontologies in lightweight DLs is not harder than in answer set programming.