Building semantic segmentation is a crucial task for building information modeling (BIM). Current research generally exploits terrestrial image data, which provides only a limited view of a building. By contrast, oblique imagery acquired by an unmanned aerial vehicle (UAV) can provide richer information about both the building and its surroundings at a larger scale. In this paper, we present a novel pipeline for building semantic segmentation from oblique UAV images using a fully convolutional neural network (FCN). To cope with the lack of facade-level annotations for UAV images, we leverage existing ground-view facade databases to simulate various aerial-view images based on estimated homographies, yielding abundant synthetic aerial image annotations as training data. The FCN is trained end-to-end and tested on full-tile UAV images. Experiments demonstrate that incorporating simulated views can significantly boost the prediction accuracy of the network on UAV images and achieve reasonable segmentation performance.
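To make the view-simulation idea concrete, the following is a minimal Python sketch (not the authors' actual pipeline) of how a ground-view facade image and its label mask could be warped with a homography to mimic an oblique aerial viewpoint; the function name `simulate_aerial_view`, the destination corner coordinates, and the file names are illustrative assumptions, and the paper instead estimates the homography from the UAV viewing geometry.

```python
import cv2
import numpy as np

def simulate_aerial_view(image, label_mask, dst_corners):
    """Warp a ground-view facade image and its label mask with a homography
    so they resemble an oblique aerial view (illustrative sketch only)."""
    h, w = image.shape[:2]
    # Facade corners in the original ground-view image (full image rectangle).
    src_corners = np.float32([[0, 0], [w - 1, 0], [w - 1, h - 1], [0, h - 1]])
    # Homography mapping the rectangle onto the assumed oblique-view quadrilateral.
    H = cv2.getPerspectiveTransform(src_corners, np.float32(dst_corners))
    # Bilinear interpolation for the image, nearest-neighbour for the mask
    # so that class indices are preserved.
    warped_img = cv2.warpPerspective(image, H, (w, h), flags=cv2.INTER_LINEAR)
    warped_lbl = cv2.warpPerspective(label_mask, H, (w, h), flags=cv2.INTER_NEAREST)
    return warped_img, warped_lbl

# Hypothetical usage: tilt the facade as if viewed from an elevated, off-axis camera.
img = cv2.imread("facade.jpg")
lbl = cv2.imread("facade_labels.png", cv2.IMREAD_GRAYSCALE)
dst = [[60, 40], [img.shape[1] - 100, 10],
       [img.shape[1] - 40, img.shape[0] - 30], [20, img.shape[0] - 80]]
aerial_img, aerial_lbl = simulate_aerial_view(img, lbl, dst)
```

Warping the label mask with nearest-neighbour interpolation keeps the synthetic annotations aligned with the simulated view, which is what allows the ground-view annotations to serve as training data for aerial-view segmentation.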