In one sense, pay zero attention to the printer's resolution information--especially the interpolated (2400 dpi) figure.

Are you printing this image at 18" in width? If so, 350 ppi is perfect for an image 6337 pixels wide.

The formula is (width in pixels) / ppi = print width in inches. So 6337 / 350 = 18.1", which is a one-to-one correspondence between the image's pixel dimensions (PPI) and the printed size across 18" of the paper.
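The arithmetic above can be sketched as a couple of one-line helpers (the function names are mine, just for illustration):

```python
# Quick arithmetic from the formula above: print width = pixels / ppi.
# Numbers match the example image (6337 px wide at 350 ppi).

def print_width_inches(width_px: int, ppi: float) -> float:
    """Width on paper when each image pixel maps to 1/ppi of an inch."""
    return width_px / ppi

def effective_ppi(width_px: int, print_width_in: float) -> float:
    """Pixels per inch you actually get at a chosen print width."""
    return width_px / print_width_in

print(round(print_width_inches(6337, 350), 2))  # 18.11
print(round(effective_ppi(6337, 10), 1))        # 633.7 -- more than the printer resolves
```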

If, on the other hand, you intend to print the image at, say, 10" in width, then you are packing more pixels per inch (PPI) into that width than the printer will resolve (DPI) onto paper--it will "throw" some of the pixels away.

Your printer can handle 24" width, correct? Put it this way: leaving aside the .20" border and using the full 24" width as borderless printing, the most pixels an image needs across its width--for a print you will be holding one foot from your face--is 28,800 (1200 dpi x 24 inches of paper). More pixels than that are wasted; fewer, and the image begins to pixelate. The DPI value stored in the image means little to nothing to the print device. All that matters is the pixel dimensions.
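That pixel budget is just dpi times paper width; a minimal sketch (function name is hypothetical):

```python
# Upper bound on useful image pixels for a given printer and paper width,
# per the reasoning above: the device can't use more than one pixel per dot.

def max_useful_pixels(printer_dpi: int, paper_width_in: float) -> int:
    """One image pixel per printer dot is the most the device can resolve."""
    return int(printer_dpi * paper_width_in)

print(max_useful_pixels(1200, 24))  # 28800
```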

Now, the DPI does matter to applications that read this little piece of metadata in the file--but it is only information and means nothing physically. Software uses it to load and display an image at a given size and to calculate when you deviate from the file's one-to-one PPI. In other words, pixels have no inherent size. A pixel can occupy as little as one printer dot or as many as 10 printer dots (and more)--never less than one printer dot. But once a pixel begins to take up more than one printer dot, the image can begin to look pixelated. The eye doesn't notice, however, until a single pixel takes up X number of dots; the "X" depends on the resolution of the printer.

LPI (lines per inch) is a halftone screen measurement that applies to a traditional offset press printing system--it doesn't apply here.

Take care, Mike