Much recent research on natural user interfaces based on user-centered design has used speech, gestures, and vision to interact with the environment and/or control Internet of Things (IoT) devices. Brain-computer interface (BCI) technology could make this interaction and control more natural, faster, more reliable, and more effective. In this paper, we propose a decoding algorithm for controlling a drone in three-dimensional (3D) space using the steady-state visually evoked potential (SSVEP)-based BCI modality. SSVEP-based BCI has great potential for use in virtual reality environments, enabling the user to control the drone with his/her brain activity in a first-person-view mode. The user thus has full control over the flight through the BCI system, commanding the drone to take off, land, go forward, stop, and turn right/left. This system offers people with no prior experience a convenient way to interact with the drone and complete a flight mission in little to no time, whereas traditional manual control takes longer to learn and master. In the decoding phase, various convolutional neural network (CNN) models were built to accommodate different control criteria, such as the generality of the model. The proposed EEG decoding pipeline was evaluated on an open-source dataset consisting of 8-channel EEG data from 10 subjects performing a 12-target SSVEP-based BCI task. High multi-class classification accuracy, ranging from 80% to 90%, was achieved, supporting a successful online simulation of drone control.
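To make the decoding setup concrete, the following is a minimal sketch of the kind of CNN classifier the abstract describes: a compact 1-D convolutional network mapping one 8-channel EEG epoch to probabilities over 12 SSVEP targets. The abstract does not specify the architecture, so the layer sizes, kernel length, and the use of plain NumPy (rather than a deep-learning framework) are illustrative assumptions, and the parameters below are random and untrained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Dataset shape from the abstract: 8-channel EEG, 12 SSVEP targets.
# The epoch length (256 samples) is an assumed value.
N_CHANNELS, N_SAMPLES, N_CLASSES = 8, 256, 12

def conv1d(x, w, b):
    """Valid 1-D convolution over time. x: (C, T), w: (F, C, K), b: (F,)."""
    F, C, K = w.shape
    T_out = x.shape[1] - K + 1
    out = np.empty((F, T_out))
    for f in range(F):
        for t in range(T_out):
            out[f, t] = np.sum(w[f] * x[:, t:t + K]) + b[f]
    return out

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def forward(x, params):
    """Tiny CNN: temporal conv -> ReLU -> global average pool -> dense softmax."""
    h = np.maximum(conv1d(x, params["w1"], params["b1"]), 0.0)  # (F, T')
    pooled = h.mean(axis=1)                                     # (F,)
    logits = params["w2"] @ pooled + params["b2"]               # (N_CLASSES,)
    return softmax(logits)

# Randomly initialised parameters; F filters of kernel length K are assumptions.
F, K = 16, 32
params = {
    "w1": rng.normal(0.0, 0.1, (F, N_CHANNELS, K)),
    "b1": np.zeros(F),
    "w2": rng.normal(0.0, 0.1, (N_CLASSES, F)),
    "b2": np.zeros(N_CLASSES),
}

epoch = rng.normal(size=(N_CHANNELS, N_SAMPLES))  # stands in for a filtered EEG epoch
probs = forward(epoch, params)
print(probs.shape)  # (12,) class probabilities summing to 1
```

In an online setting, each incoming epoch would be passed through such a `forward` call and the arg-max class mapped to a flight command (take off, land, forward, stop, turn right/left), with the network trained beforehand on the labeled SSVEP epochs.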