Summary: I mounted an HDFS NFS volume on OS X, and it will not let me edit existing files. I can append, and I can create files with content, but I cannot open them with the "write" flag.
Originally I asked a narrower question about JupyterLab failing to save notebooks to the NFS-mounted volume, but while trying to get to the root of the problem I realized (correctly, I hope) that it is really about editing existing files.
I mounted HDFS over NFS on OS X and I can access the files, read, write, and so on. JupyterLab, although it can do pretty much everything else, cannot actually save notebooks.
I was able to pin down the pattern of what is actually going on, and the problem boils down to this: you cannot open an existing file on the NFS volume for writing.
This works for a new file:
with open("rand.txt", 'w') as f:
f.write("random text")
But if you run it again (the file now exists and the content is there), you get the following exception:
---------------------------------------------------------------------------
OSError                                   Traceback (most recent call last)
<ipython-input-15-94a46812fad4> in <module>()
----> 1 with open("rand.txt", 'w') as f:
      2     f.write("random text")

OSError: [Errno 22] Invalid argument: 'rand.txt'
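Dropping down to os.open narrows it further. Python's open(path, 'w') is roughly O_WRONLY|O_CREAT|O_TRUNC, so my working assumption (mine, not anything from the gateway docs) is that it is the truncation that gets rejected. A small probe to check, run against the same mount:

import os

path = "rand.txt"  # existing file on the NFS mount

# open(path, 'w') maps to O_WRONLY|O_CREAT|O_TRUNC; 'a' maps to O_WRONLY|O_CREAT|O_APPEND
for name, flags in [
    ("O_WRONLY|O_CREAT|O_TRUNC", os.O_WRONLY | os.O_CREAT | os.O_TRUNC),
    ("O_WRONLY|O_CREAT|O_APPEND", os.O_WRONLY | os.O_CREAT | os.O_APPEND),
]:
    try:
        fd = os.open(path, flags)
        os.close(fd)
        print(name, "-> ok")
    except OSError as e:
        print(name, "->", e)  # I would expect EINVAL only on the O_TRUNC variant

If only the O_TRUNC variant fails, that would match the append behavior shown below.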
I am fairly sure permissions and everything else are fine:
with open("seven.txt", 'w') as f:
f.write("random text")
f.writelines(["one","two","three"])
r = open("seven.txt", 'r')
print(r.read())
random textonetwothree
I can also append to the file without any problem:
aleksandrs-mbp:direct sasha$ echo "Another line of text" >> seven.txt && cat seven.txt
random textonetwothreeAnother line of text
I mounted it with the following options:
aleksandrs-mbp:hadoop sasha$ mount -t nfs -o vers=3,proto=tcp,nolock,noacl,sync localhost:/ /srv/ti/jupyter-samples/~Hadoop
The Apache documentation indicates that the NFS gateway does not support random writes. I tried looking through the mount options but could not find anything specific that would force sequential writes. I experimented with different options, but none of them seemed to help much.
Here is the exception JupyterLab gets when it tries to save a notebook:
[I 03:03:33.969 LabApp] Saving file at /~Hadoop/direct/One.ipynb
[E 03:03:33.980 LabApp] Error while saving file: ~Hadoop/direct/One.ipynb [Errno 22] Invalid argument: '/srv/ti/jupyter-samples/~Hadoop/direct/.~One.ipynb'
Traceback (most recent call last):
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/notebook/services/contents/filemanager.py", line 471, in save
self._save_notebook(os_path, nb)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/notebook/services/contents/fileio.py", line 293, in _save_notebook
with self.atomic_writing(os_path, encoding='utf-8') as f:
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/contextlib.py", line 82, in __enter__
return next(self.gen)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/notebook/services/contents/fileio.py", line 213, in atomic_writing
with atomic_writing(os_path, *args, log=self.log,**kwargs) as f:
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/contextlib.py", line 82, in __enter__
return next(self.gen)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/notebook/services/contents/fileio.py", line 103, in atomic_writing
copy2_safe(path, tmp_path, log=log)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/notebook/services/contents/fileio.py", line 51, in copy2_safe
shutil.copyfile(src, dst)
File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/shutil.py", line 115, in copyfile
with open(dst, 'wb') as fdst:
OSError: [Errno 22] Invalid argument: '/srv/ti/jupyter-samples/~Hadoop/direct/.~One.ipynb'
[W 03:03:33.981 LabApp] 500 PUT /api/contents/~Hadoop/direct/One.ipynb?1534835013966 (::1): Unexpected error while saving file: ~Hadoop/direct/One.ipynb [Errno 22] Invalid argument: '/srv/ti/jupyter-samples/~Hadoop/direct/.~One.ipynb'
[W 03:03:33.981 LabApp] Unexpected error while saving file: ~Hadoop/direct/One.ipynb [Errno 22] Invalid argument: '/srv/ti/jupyter-samples/~Hadoop/direct/.~One.ipynb'
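Reading that traceback, the notebook body is never written at all: Jupyter's atomic save first copies the existing notebook to a temp file (.~One.ipynb) via shutil.copyfile, and it is the copy's open(dst, 'wb') that fails. Distilled to one call, with the paths taken from the log (my assumption being that the temp file survives from an earlier attempt, so the 'wb' open has to truncate it):

import shutil

# The failing chain from the traceback, reduced to its last step:
# atomic_writing -> copy2_safe -> shutil.copyfile -> open(dst, 'wb')
src = "/srv/ti/jupyter-samples/~Hadoop/direct/One.ipynb"
dst = "/srv/ti/jupyter-samples/~Hadoop/direct/.~One.ipynb"
shutil.copyfile(src, dst)  # OSError: [Errno 22] Invalid argument on this mount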
And this is what I see in the NFS gateway log at the same time:
2018-08-21 03:05:34,006 ERROR org.apache.hadoop.hdfs.nfs.nfs3.RpcProgramNfs3: Setting file size is not supported when setattr, fileId: 16417
2018-08-21 03:05:34,006 ERROR org.apache.hadoop.hdfs.nfs.nfs3.RpcProgramNfs3: Setting file size is not supported when setattr, fileId: 16417
I am not entirely sure what this means, but if I understand the RFC correctly, it should be part of the implementation:
servers must support extending the file size via SETATTR.
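If that reading is right, an open with 'w' makes the NFS client truncate the file through a SETATTR that sets the size, which is exactly the operation the gateway logs as unsupported. Presumably an explicit truncate trips over the same thing, in both directions; a quick probe (an untested guess on my part):

import os

# Assumption: os.truncate issues the same SETATTR(size) the gateway rejects,
# so both calls should reproduce the EINVAL without open() being involved at all.
try:
    os.truncate("seven.txt", 0)      # shrink: what open(..., 'w') needs
except OSError as e:
    print("shrink:", e)
try:
    os.truncate("seven.txt", 4096)   # extend: what the RFC says must work
except OSError as e:
    print("extend:", e)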
I understand the complexity of mounting HDFS and letting clients write whatever they want while keeping those files distributed and consistent. But is there some compromise that would make writes over NFS possible?