3D Points Tensorflow

I have a large dataset where each sample consists of 120 3D points (x, y, z). I am trying to use a convolutional network to recognize shapes in those 120 points and assign each sample to one of roughly 7 object classes.

This is not a point cloud; the point order has semantic meaning, so point 1 is always the top of the object, and so on.

input_shape = [120, 3, 1]
# [[[x_1], [y_1], [z_1]], ..., [[x_120], [y_120], [z_120]]]
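
For context, this is roughly how I shape the data before feeding it to the network (the array names and sizes here are placeholders for my real data):

import numpy as np

# samples: N recordings, each an ordered list of 120 (x, y, z) points
# labels:  N integer class ids in the range 0..6
samples = np.random.rand(1000, 120, 3)        # placeholder for the real data
labels = np.random.randint(0, 7, size=1000)   # placeholder for the real labels

# add a trailing channel axis so each sample matches input_shape = [120, 3, 1]
x_train = samples.reshape(-1, 120, 3, 1).astype("float32")
y_train = labels.astype("int32")

print(x_train.shape)   # (1000, 120, 3, 1)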

The 2D convolutional model gets stuck at around 50% accuracy, a little better than a random guess but not good at all.

NN Design:

import tensorflow as tf
from tensorflow.keras.layers import Conv2D, BatchNormalization, Dropout, Flatten, Dense

model = tf.keras.models.Sequential()
# the input is (120 points, 3 coordinates, 1 channel); these kernels slide along
# the point axis only, so each coordinate is convolved independently
model.add(Conv2D(64, kernel_size=(15, 1), activation="relu", input_shape=(120, 3, 1)))
model.add(BatchNormalization())
model.add(Dropout(0.25))
model.add(Conv2D(32, kernel_size=(5, 1), activation="relu"))
model.add(BatchNormalization())
model.add(Dropout(0.15))

model.add(Flatten())

model.add(Dense(256, activation=tf.nn.sigmoid))
model.add(BatchNormalization())
model.add(Dropout(0.25))
model.add(Dense(64, activation=tf.nn.sigmoid))
model.add(BatchNormalization())
model.add(Dropout(0.25))
model.add(Dense(7, activation="softmax"))  # one unit per object class
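
For completeness, this is roughly how I compile and train it (the optimizer, loss, batch size and epoch count are just the values I happen to use, nothing carefully tuned):

# x_train has shape (N, 120, 3, 1); y_train holds integer class ids 0..6
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train,
          batch_size=32,
          epochs=50,
          validation_split=0.2)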

I fear that the NN is not seeing each (x, y, z) point as correlated; instead it treats x, y and z as separate pieces of data. Intuitively, I think the x, y, z values are more valuable together.

How do I get the NN to see the x,y,z as a correlated data point?
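
Is something along these lines the right direction? This is only a sketch of what I mean by "seeing x, y, z together", not code I have verified to help:

from tensorflow.keras.layers import Conv1D, Conv2D

# option A: a 2D kernel that covers 15 consecutive points AND all 3 coordinates,
# so each filter sees whole (x, y, z) points rather than one coordinate at a time
first_layer_2d = Conv2D(64, kernel_size=(15, 3), activation="relu",
                        input_shape=(120, 3, 1))

# option B: drop the dummy channel axis and treat x, y, z as the 3 input channels
# of a 1D convolution over the 120-point sequence
first_layer_1d = Conv1D(64, kernel_size=15, activation="relu",
                        input_shape=(120, 3))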