Knowledge Graph Completion (KGC) aims to complete missing relationships between entities in a Knowledge Graph (KG). While closed-world KGC approaches that rely only on the knowledge within the KG can recover just a limited number of missing relations, a growing number of approaches draw knowledge from open-world resources such as online encyclopedias and newswire corpora. For instance, a recently proposed open-world KGC model called ConMask learns embeddings of an entity’s name and parts of its text description to connect unseen entities to the KG. However, this model does not make full use of the rich feature information in the text descriptions; moreover, its relationship-dependent content masking method can easily fail to locate the target words. In this paper, we propose a Multiple Interaction Attention (MIA) mechanism that models the interactions among the head entity description, the head entity name, the relationship name, and the candidate tail entity descriptions to form enriched representations. In addition, we use additional textual features of the head entity description to enhance the head entity representation, and we apply attention among candidate tail entities to enhance their representations. We also experiment with different scoring functions to improve the convergence of the model. Our empirical study on three real-world data collections shows that our approach achieves significant improvements over state-of-the-art KGC methods.
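The core idea of attending between two pieces of KG text can be illustrated with a minimal sketch. The snippet below is a generic scaled dot-product interaction between a head-entity description and a relation name, with illustrative shapes and random embeddings; it is an assumption-laden simplification, not the authors' exact MIA formulation:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def interaction_attention(desc, query):
    """One interaction step: each word vector in `desc` (n x d) attends
    over the word vectors in `query` (m x d), yielding a query-aware
    re-representation of `desc` with shape (n x d)."""
    scores = desc @ query.T / np.sqrt(desc.shape[1])  # (n, m) similarities
    weights = softmax(scores, axis=-1)                # attention over query words
    return weights @ query                            # (n, d) enriched output

# Hypothetical inputs: 20 description words, a 3-word relation name, dim 16.
rng = np.random.default_rng(0)
head_desc = rng.normal(size=(20, 16))
rel_name = rng.normal(size=(3, 16))
enriched = interaction_attention(head_desc, rel_name)
print(enriched.shape)  # (20, 16)
```

In the full model, analogous interaction steps would be applied between each pair of text sequences (head description, head name, relation name, tail descriptions) and the resulting representations combined before scoring.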