To listen to brain activity as a piece of music, we previously proposed the scale-free brainwave music (SFBM) technology, which translates the scalp electroencephalogram (EEG) into musical notes according to the power law shared by EEG and music. In the current study, this methodology is further extended to a two-channel musical ensemble. First, EEG data from two selected channels are translated into musical instrument digital interface (MIDI) sequences, in which EEG parameters modulate the pitch, duration, and volume of each note. The phase synchronization index of the two channels is computed using the Hilbert transform. The two MIDI sequences are then integrated into a chorus according to this index: EEG epochs with a high synchronization index are represented by more consonant musical intervals, whereas epochs with a low index are expressed by more dissonant intervals. The brain ensemble derived from real EEG segments shows differences in harmony and pitch distribution between the eyes-closed and eyes-open states. Furthermore, scale-free phenomena are preserved in the brainwave ensemble. Therefore, the scale-free brain ensemble modulated by phase synchronization is a new attempt to express the EEG in an auditory and musical way, and it may be used for EEG monitoring and biofeedback.
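
The phase synchronization index referred to above is commonly computed as a phase-locking value between the instantaneous phases of the two channels, obtained from their analytic signals. The following is a minimal Python sketch of that computation, together with an illustrative consonance mapping; the interval ordering and thresholds are assumptions for demonstration, not the paper's actual mapping rule.

```python
import numpy as np
from scipy.signal import hilbert


def phase_sync_index(x, y):
    """Phase-locking value between two equal-length EEG channel signals.

    The instantaneous phase of each channel is taken from the analytic
    signal (Hilbert transform); the index is the magnitude of the mean
    phase-difference vector, from 0 (no locking) to 1 (perfect locking).
    """
    phase_x = np.angle(hilbert(x))
    phase_y = np.angle(hilbert(y))
    return np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))


if __name__ == "__main__":
    # Hypothetical demo: two noisy 10 Hz sinusoids standing in for alpha activity.
    fs = 250
    t = np.arange(0, 4, 1 / fs)  # 4 s of data at 250 Hz
    rng = np.random.default_rng(0)
    ch1 = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
    ch2 = np.sin(2 * np.pi * 10 * t + 0.3) + 0.5 * rng.standard_normal(t.size)

    psi = phase_sync_index(ch1, ch2)
    print(f"phase synchronization index: {psi:.2f}")

    # Illustrative mapping (an assumption, not the paper's rule): intervals in
    # semitones ordered roughly from dissonant to consonant; a higher
    # synchronization index selects a more consonant interval between voices.
    intervals = [1, 11, 6, 2, 10, 3, 8, 4, 9, 5, 7, 12]
    interval = intervals[min(int(psi * len(intervals)), len(intervals) - 1)]
    print(f"interval between the two voices: {interval} semitones")
```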