
Genre: Classify

Top Classify Artists

Showing 25 of 98 artists
| Rank | Artist | Country | Count | Listeners |
|-----:|--------|---------|------:|----------:|
| 1 | | | 26,151 | 1.1 million |
| 2 | | | 17,111 | 656,403 |
| 3 | | | 1,159 | 343,125 |
| 4 | | | 10,018 | 304,069 |
| 5 | | | 30,063 | 263,378 |
| 6 | | | 8,275 | 211,535 |
| 7 | | | 3,976 | 197,155 |
| 8 | David Osborne | United States | 11,685 | 160,589 |
| 9 | | | 1,926 | 158,961 |
| 10 | | | 6,235 | 149,415 |
| 11 | | | 4,526 | 131,195 |
| 12 | | | 7,489 | 120,905 |
| 13 | | | 2,963 | 117,954 |
| 14 | Chris Snelling | United Kingdom | 6,704 | 114,318 |
| 15 | | | 3,767 | 103,023 |
| 16 | | | 68,824 | 101,855 |
| 17 | Judson Hurd | United States | 1,907 | 100,121 |
| 18 | | | 3,366 | 95,220 |
| 19 | | | 4,741 | 90,133 |
| 20 | | | 2,987 | 87,295 |
| 21 | | | 6,076 | 76,099 |
| 22 | | | 3,152 | 73,382 |
| 23 | | | 3,530 | 69,282 |
| 24 | | | 15,518 | 66,419 |
| 25 | | | 1,332 | 58,807 |

About Classify

Note: Classify is a fictional music genre imagined for this description; any resemblance to real genres or artists is coincidental.

Classify is imagined as a cognitive soundscape: a meta-genre that draws not from a single tradition but from the act of classifying sound itself. It favors modular structures in which sonic elements are treated like taxonomic units: drums labeled by rhythm family, synthesizer timbres grouped by texture category, and vocals cast as phonetic tags rather than linear narratives. The listening experience resembles a gallery of playlists that comments on itself, inviting listeners to reorder, reinterpret, and even rename sub-genres in real time.

The movement began in the late 2010s in a transcontinental constellation of producers, critics, and programmers who met in forums, at indie labels, and during late-night studio sessions in London, Osaka, and Berlin. One early landmark release was Taxonomic Echoes by Nova Lumen, which paired field recordings with algorithmic tagging and a deliberate ambiguity between tempo and meter. The album helped popularize a method in which tempo appears to drift as if cataloged, and each sound carries an invisible label that reveals itself only during playback.

Core features include taxonomy-guided composition, embedded metadata as musical cues, micro-tempo modulation, generative sampling, and live performances that fuse visuals with classification prompts. Instruments range from granular synths to found objects, but the hallmark is modularity: a track can be deconstructed into "classes" that listeners can switch, reorder, or subtract during performances. Vocal lines are often abstract, using phonemes as semantic carriers rather than words.

Prominent artists and ambassadors include Nova Lumen, Kaito Ren, Ione Vale, Thalia Skye, and Sora Kato, who have released collaborative projects that test audience-driven classification. Critics also cite Elena Miro as an activist-ambassador who helped bring classification maps into club environments, pairing live visuals with a real-time tagging interface.

Classify has gained traction in Japan, the United Kingdom, Germany, Brazil, and parts of North America, with online communities growing around dedicated forums, niche labels, and club nights. Festivals feature classification panels between sets, while clubs in Shibuya and Brixton host weekly Classify showcases and collaborative labeling sessions. Outside clubs, universities examine taxonomy-inspired listening as a sonic ontology, while producers experiment with open-source sample packs that let fans remix the classification model.

Although still a fringe concept, Classify influences producers across scenes by encouraging modular thinking, cross-media collaboration, and a playful, critical stance toward genre boundaries. It is listening as inquiry, and an invitation to reorganize sound. To approach Classify, sample a track as a whole and then explore its internal classes: note the tempo class, texture family, and vocal tag. Critics suggest starting with a flagship release, then moving to collaborations that mix disciplines, from visual art to data visualization. As the movement evolves, expect more experimental formats: interactive apps that let you reclassify on the fly, and performances that turn the audience's labels into live musical decisions. Classify remains a promise rather than a fixed destination.
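The modular "class" structure described above, in which a track is a set of labeled sonic units that a listener can reorder or subtract, can be sketched as a small data model. This is a hypothetical illustration, not any real app's API; all names (SonicUnit, Track, the family labels) are invented for the sketch.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class SonicUnit:
    """One taxonomic unit of a track: a sound plus its classification label."""
    name: str    # e.g. "kick_loop_a" (hypothetical sample name)
    family: str  # taxonomic label, e.g. "rhythm/breakbeat"

@dataclass
class Track:
    title: str
    units: list = field(default_factory=list)

    def subtract(self, family: str) -> "Track":
        """Return a copy of the track with one taxonomic family removed."""
        return Track(self.title, [u for u in self.units if u.family != family])

    def reorder(self, priority: list) -> "Track":
        """Reorder units so listener-chosen families come first (stable sort)."""
        rank = {fam: i for i, fam in enumerate(priority)}
        ordered = sorted(self.units, key=lambda u: rank.get(u.family, len(rank)))
        return Track(self.title, ordered)

track = Track("Taxonomic Echoes (excerpt)", [
    SonicUnit("kick_loop_a", "rhythm/breakbeat"),
    SonicUnit("pad_glass", "texture/granular"),
    SonicUnit("voice_tags", "vocal/phoneme"),
])

# Subtract the vocal class, as a listener might during a performance.
no_vocals = track.subtract("vocal/phoneme")
print([u.name for u in no_vocals.units])  # ['kick_loop_a', 'pad_glass']
```

Returning new Track copies rather than mutating in place mirrors the genre's conceit that the original recording is a fixed taxonomy and each listener's reclassification is a derived view of it.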